HipStyles was removed from the App Store over a year ago, but from time to time I still think it could be resurrected. However, after taking a long hard look at Apple's new Photos framework in iOS 9, I think it is just not feasible anymore.

HipStyles and the Assets Library Framework

HipStyles was a photo finder for Hipstamatic and Oggl photos: it looked for snaps taken with a Hipstamatic lens, film or flash of your choosing. I developed it because it was getting increasingly difficult to find and categorize the great shots taken with the many virtual vintage films and lenses available in Hipstamatic. Hipstamatic still publishes new gear monthly, and Eric Rozen's Hipstography does a great job of covering it.

Technically, what HipStyles did was look at your iPhone's photo library, pull certain metadata out of each photo, and present the results. When Hipstamatic saves a photo, it writes the gear information into a few fields of the {TIFF} metadata dictionary. You can see this for yourself by opening a Hipstamatic photo in a metadata reader application.
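If you want to peek at the same information programmatically, a minimal sketch along these lines dumps the {TIFF} dictionary of a photo file. The file path is just a placeholder, and exactly which fields Hipstamatic uses is not spelled out here; the point is only that the gear information lives somewhere inside that dictionary.

import Foundation
import CoreImage

// Hypothetical file path; point this at an actual Hipstamatic photo.
let photoURL = NSURL(fileURLWithPath: "/path/to/hipstamatic-photo.jpg")

if let image = CIImage(contentsOfURL: photoURL) {
    if let tiffMetadata = image.properties["{TIFF}"] as? [String: AnyObject] {
        // The gear information sits in a few of these fields.
        print(tiffMetadata)
    }
}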

Using the old Assets Library Framework this was relatively easy, if not particularly fast. Basically, you would get an instance of the assets library, loop through the photo assets, and act on the information:

[self.assetsLibrary enumerateGroupsWithTypes:ALAssetsGroupAllPhotos usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
    [group enumerateAssetsUsingBlock:^(ALAsset *result, NSUInteger index, BOOL *stop) {
        NSString *assetType = [result valueForProperty:ALAssetPropertyType];
        if ([assetType isEqualToString:ALAssetTypePhoto]) {
            ALAssetRepresentation *repr = [result defaultRepresentation];
            NSDictionary *metadata = [repr metadata];
            // Do something interesting with the metadata...
        }
    }];
}
failureBlock:^(NSError *error) {
    NSLog(@"Failed to enumerate the asset groups.");
}];

This method is also detailed in the Apple Technical Q&A 1622.

Introducing the Photos framework

In iOS 8, Apple introduced the Photos framework, which in many ways is a lot better than the old Assets Library. Some good general descriptions of it have been published by objc.io, iOS 8 Day-by-Day, and NSHipster.

In practice, it is now a lot easier to load photos from the user's photo library, which may be stored completely or partially in the cloud. If you have turned on the iCloud Photo Library feature in iOS, the master copies of your photos live in cloud storage, and the Photos application (or an app that you have developed yourself) manages the downloading and caching of the assets as necessary.

For details, see Apple's example application using the Photos framework.

It is also now easier to get at the most commonly used photo metadata, such as creation and modification dates, the location where the photo was taken, and the pixel size of the image.
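Before any of that can happen, though, the user has to grant the app access to the photo library. A minimal sketch of the authorization step, where loadPhotos() is a hypothetical method standing in for whatever starts your own fetching:

import Photos

PHPhotoLibrary.requestAuthorization { status in
    if status == .Authorized {
        // Access granted; loadPhotos() is a hypothetical method that starts the fetching below.
        loadPhotos()
    }
    else {
        print("No access to the photo library.")
    }
}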

This is roughly how you would start retrieving the photos:

let fetchOptions = PHFetchOptions()
fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: true)]
fetchOptions.includeAssetSourceTypes = .TypeUserLibrary

let fetchResults = PHAsset.fetchAssetsWithOptions(fetchOptions)
var assets = [PHAsset]()
fetchResults.enumerateObjectsUsingBlock { (object: AnyObject, count: Int, stop: UnsafeMutablePointer<ObjCBool>) in
    assets.append(object as! PHAsset)
}

Once you have the assets, you can go on to pick out the metadata you need.
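That part really is easy: the common items mentioned earlier are exposed directly as properties of PHAsset, so no image data is needed for them. For example:

for asset in assets {
    // The basics are available straight from the asset, no image data required.
    print("Created:  \(asset.creationDate)")
    print("Modified: \(asset.modificationDate)")
    print("Location: \(asset.location)")
    print("Pixels:   \(asset.pixelWidth) x \(asset.pixelHeight)")
}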

Initially it sounded like a no-brainer to just make HipStyles use the new Photos framework. However, the Photos framework does not expose all the metadata that HipStyles needs. Specifically, the {TIFF} metadata is not available without getting all the image data first.

When you get the image data, it is possible to extract the metadata. This is a function to handle the results of an image data request:

func imageDataRequestResultHandler(imageData: NSData?, dataUTI: String?, orientation: UIImageOrientation, info: [NSObject : AnyObject]?) -> Void {
    let requestID = info![PHImageResultRequestIDKey] as? Int32

    if let data = imageData {
        if let ciImage = CIImage(data: data) {
            if let metadata = ciImage.properties as? [String: AnyObject] {
                // Do something interesting with the metadata...
            }
            else {
                print("\(requestID): no metadata available")
            }
        }
        else {
            print("\(requestID): could not create an image from the data")
        }
    }
}

On the surface, it sounds simple: just get the image data for each image, extract the metadata, and do whatever you need with it. There is just one problem, and it's a big one: you need to get all the image data first. All of it.

Consider a user who has a couple of thousand photos in their photo library and has decided to store them all in iCloud. Each of these photos could be multiple megabytes in size. We don't know which of those photos are interesting for HipStyles, and we can't know unless we get the image data of each and every candidate. So you create a PHCachingImageManager, start it caching the assets, configure some image request options, and send the image manager on a mission:

let imageManager = PHCachingImageManager()
imageManager.startCachingImagesForAssets(assets, targetSize: cellSize, contentMode: .Default, options: nil)

let imageRequestOptions = PHImageRequestOptions()
imageRequestOptions.deliveryMode = .FastFormat
imageRequestOptions.synchronous = true
imageRequestOptions.networkAccessAllowed = true

for asset in assets {
    imageManager.requestImageDataForAsset(asset, options: imageRequestOptions, resultHandler: imageDataRequestResultHandler)
}

Because some or all of the photos could be in iCloud instead of the user's phone, you need to allow network access.

You can also set a request progress handler before requesting the images:

imageRequestOptions.progressHandler = imageRequestProgressHandler

func imageRequestProgressHandler(progress: Double, error: NSError?, stop: UnsafeMutablePointer<ObjCBool>, info: [NSObject : AnyObject]?) -> Void {
    print("\(progress)")
}

Then you wait… and wait… and wait. The image manager will happily retrieve all the photos from iCloud over the network. I don't know about you, but my vintage iPhone 5 from 2012, with 32 GB of storage and full of apps, otherwise happily running iOS 9.2.1 with no problems, will freeze and reboot when its storage gets absolutely full. So I experienced a couple of crashes when I tested the photo retrieval code above, because the image manager dutifully filled my storage with local copies of the photos from iCloud. All because I wanted to get at some photo metadata that is not available from the PHAsset class.

You can make the metadata retrieval easier with a helper like PHAsset+Utility, but I don't think it makes the root problem go away: to get at certain pieces of photo metadata, you need to get all the image data first.

In iOS 9, the Assets Library Framework is deprecated in favour of the new Photos framework. There is obviously no guarantee of how much longer the old framework will keep working, and in any case it doesn't handle the iCloud Photo Library use case at all.

If you have any ideas, by all means send them my way. But at this point I'm convinced that it is not feasible to recreate HipStyles using the Photos framework.