I am working on an app with a photo section that uses a UICollectionView. The images in the cells are blurry, but they are sharp when viewed full screen and when shown as thumbnails in a table view on a different screen. I have an NSManagedObject subclass called Photo that imports Photos, references PHAssets, and provides methods called getAsset() and getImage(size).
Here is the code for my UICollectionViewCell subclass:
class PhotoCollectionViewCell: UICollectionViewCell {
    @IBOutlet weak var imageForCell: UIImageView!
    @IBOutlet weak var labelForCell: UILabel!

    func setImage(photo: Photo?, size: CGSize) -> Bool {
        if photo != nil {
            if let image = photo?.getImage(size) {
                let label = photo?.getLabel()
                imageForCell.contentMode = UIViewContentMode.ScaleAspectFit
                imageForCell.image = image
                labelForCell.text = label
                return true
            }
        }
        return false
    }
}
From collectionView(_:cellForItemAtIndexPath:) in my UICollectionView subclass:
if let photo = listOfPhotos[indexPath.item] as? Photo {
    let size = CGSizeMake(500, 350)
    if !cell.setImage(photo, size: size) {
        // Handles a missing photo.
    }
}
Two methods from my Photo NSManagedObject subclass, which finds a photo based on its creation date:
func getAsset() -> PHAsset? {
    let fetchOptions = PHFetchOptions()
    fetchOptions.predicate = NSPredicate(format: "creationDate = %@", photoDate)
    let assets = PHAsset.fetchAssetsInAssetCollection(assetCollection, options: fetchOptions)
    var asset: PHAsset?
    if assets.count == 1 {
        asset = assets.lastObject as? PHAsset
    } else {
        // Handles an invalid result.
    }
    return asset
}
func getImage(size: CGSize) -> UIImage? {
    var image: UIImage?
    if let asset = getAsset() {
        PHImageManager.defaultManager().requestImageForAsset(asset, targetSize: size, contentMode: PHImageContentMode.AspectFit, options: nil) { (result, info) -> Void in
            image = result
            println("P gI: Image gotten. Image Size: \(image?.size), Size: \(size), Date: \(asset.creationDate)")
        }
    }
    return image
}
The println() statement in getImage showed me that each image is being fetched twice, at two different sizes, neither of which is the specified size. The smaller size for each cell is fetched first, followed by a larger size for each cell. I suspect the smaller image is being blown up to fill my cell (which would make it blurry) and the larger image never replaces it (which would be sharp). Logging output:
P gI: Image gotten. Image Size: Optional((64.0,48.0)), Size: (500.0,350.0), Date: 2014-12-04 18:56:20 +0000
P gI: Image gotten. Image Size: Optional((64.0,48.0)), Size: (500.0,350.0), Date: 2014-12-04 18:56:36 +0000
P gI: Image gotten. Image Size: Optional((2048.0,1530.0)), Size: (500.0,350.0), Date: 2014-12-04 18:56:20 +0000
P gI: Image gotten. Image Size: Optional((2048.0,1530.0)), Size: (500.0,350.0), Date: 2014-12-04 18:56:36 +0000
In my storyboard, both my imageView and cell are set for 250 x 175. My imageView is set to Aspect Fit in the inspector.
The Photos framework fetches the smaller size first for speed and then fetches the larger size. With the default (opportunistic) delivery, your completion handler can run synchronously only for the small, degraded image, so getImage returns the 64 × 48 thumbnail and the full-quality image arrives after getImage has already returned. You can tell it to fetch only the larger image by setting the request options: create a PHImageRequestOptions, set its deliveryMode to HighQualityFormat, and set synchronous to true so the handler runs before the request call returns. See PHImageRequestOptionsDeliveryMode for the other delivery modes.
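A minimal sketch of getImage with those options set, in the same Swift 1.x syntax as the question (getAsset() is the asker's own method):

```swift
func getImage(size: CGSize) -> UIImage? {
    var image: UIImage?
    if let asset = getAsset() {
        let options = PHImageRequestOptions()
        // Deliver only the full-quality image; skip the fast degraded pass.
        options.deliveryMode = PHImageRequestOptionsDeliveryMode.HighQualityFormat
        // Run the handler before requestImageForAsset returns,
        // so `image` is already set when we return it below.
        options.synchronous = true
        PHImageManager.defaultManager().requestImageForAsset(asset, targetSize: size, contentMode: PHImageContentMode.AspectFit, options: options) { (result, info) -> Void in
            image = result
        }
    }
    return image
}
```

Note that a synchronous request can block the thread it is issued on, so calling this from cellForItemAtIndexPath on the main thread may hurt scrolling performance. The alternative is to keep the request asynchronous and assign the image to the cell inside the handler, skipping results flagged with PHImageResultIsDegradedKey in the info dictionary.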