Working for a company that deals with photographs, I’ve spent a bit of time dealing with various aspects of metadata within images. For the most part, accessing this data isn’t a problem. Many languages have built-in EXIF reading capabilities, and there are some fantastic open-source libraries available if not. As well as the technical considerations, a big concern surrounding metadata is that of privacy. There’s a certain level of responsibility you must assume when dealing with metadata; errant GPS coordinates, for example, are a security concern that many users won’t think about when sharing their photos with the wider world.
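On Apple platforms, reading that metadata is straightforward via the ImageIO framework. A minimal sketch (the file path is a placeholder, and error handling is kept to a minimum):

```objc
#import <Foundation/Foundation.h>
#import <ImageIO/ImageIO.h>

// Read the EXIF and GPS dictionaries from an image file via ImageIO.
NSURL *url = [NSURL fileURLWithPath:@"photo.jpg"];  // hypothetical path
CGImageSourceRef source = CGImageSourceCreateWithURL((CFURLRef)url, NULL);
if (source) {
    NSDictionary *props =
        [(NSDictionary *)CGImageSourceCopyPropertiesAtIndex(source, 0, NULL)
            autorelease];
    NSDictionary *exif = [props objectForKey:(NSString *)kCGImagePropertyExifDictionary];
    NSDictionary *gps  = [props objectForKey:(NSString *)kCGImagePropertyGPSDictionary];
    NSLog(@"EXIF: %@", exif);
    NSLog(@"GPS: %@", gps);  // only present if the camera embedded coordinates
    CFRelease(source);
}
```

It’s that GPS sub-dictionary – latitude, longitude, often a timestamp – that quietly travels with a shared photo unless something strips it out.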
Sharing media is big business nowadays, and consistent metadata is increasingly important – social media sites thrive on linking together information in ever-increasing ways. Considering Apple is throwing bazillions of dollars at portable devices capable of not only capturing but sharing images and movies, you’d think their approach to metadata – like their approach to many other aspects of iOS – would be fairly comprehensive.
Unfortunately, it is anything but. From low-level functionality all the way up to App Store review, Apple’s manner of dealing with metadata is little short of a nightmare.
The fundamental problem with iOS’ approach lies within its model for dealing with 3rd party apps. In iOS, Apple’s native apps have – as you would expect – access to private APIs within the system. 3rd party apps, on the other hand, run in a sandbox and have no access to these APIs. This security model is fundamental to iOS and is (in my opinion) one of its greatest strengths.
The ability to read and write images on the device is one area that bridges both private and public APIs. When you take a photo with the native Camera app, various information is written into the underlying image data that’s stored in the Camera Roll. If you subsequently copy that image from the roll and paste it into an email composed in Mail, the image that arrives in the recipient’s mailbox will contain all that metadata intact. Nice ’n easy.
Unfortunately, doing something similar with 3rd party apps doesn’t work. Although your app can utilise a public API (UIImagePickerController) to take a picture using the camera, there’s no way (prior to OS 4, which I’ll get to in a minute) to write the metadata to the image. Similarly, accessing images from the roll gives you a representation of the image, rather than the image data itself – in other words, you get the pixel data only, regardless of any metadata the original image may have.
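For what it’s worth, OS 4 does improve the capture side of this: the picker’s delegate callback hands you the capture metadata alongside the image. A sketch of the delegate method (the surrounding class and what you do with the metadata are left as assumptions):

```objc
#import <UIKit/UIKit.h>

// UIImagePickerControllerDelegate callback. On iOS 4 the info dictionary
// carries the capture metadata (EXIF/TIFF/GPS sub-dictionaries) under
// UIImagePickerControllerMediaMetadata; on iOS 3.x that key is absent.
- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    NSDictionary *metadata =
        [info objectForKey:UIImagePickerControllerMediaMetadata];

    // ...combine image + metadata when saving the photo...

    [picker dismissModalViewControllerAnimated:YES];
}
```

Note this only covers photos your app captures itself; it does nothing for images already sitting in the Camera Roll.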
The App Store review process has been (quite rightly) torn to pieces by the majority of developers who have had to tangle with it. I’ve not had a bad experience with it personally, but I highlight it here because various apps have circumvented the rules to gain access to metadata. Without going into detail, these apps use a private API call to scan the disk directly, bypassing the public API and thereby accessing the original images.
I don’t have a problem with the developers of these apps per se (although I’m of the opinion that using private APIs isn’t a good idea, no matter what functionality it may give you). It’s simply frustrating that Apple seems to blatantly turn a blind eye to high-profile developers while enforcing the rules for everyone else.
The principle of denying metadata to 3rd party apps was understandable, but what we really needed was a way of requesting permission to access metadata, directly from the user. Of course, when iOS 4 was released there looked to be a shiny new framework, ALAssetsLibrary, to do just this. But of course it ain’t that simple.
The Asset Library offers a different way of accessing images in the Camera Roll. Instead of providing a list of thumbnails to pick from, the API gives programmatic access to all media on the device (video included). Before an app gains access to assets via the framework, the user is shown a prompt to accept or deny. Sounds good, but unfortunately ALAssetsLibrary is flawed in four separate ways.
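The basic shape of the API is a pair of nested enumerations – groups, then assets within each group. A minimal sketch (manual retain/release era; the property I log is just an example):

```objc
#import <AssetsLibrary/AssetsLibrary.h>

// Enumerate every photo in the Saved Photos group. The first call into the
// library triggers the permission prompt; if the user denies it, the
// failureBlock fires and no assets are delivered at all.
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos
                       usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
    [group enumerateAssetsUsingBlock:^(ALAsset *asset,
                                       NSUInteger index,
                                       BOOL *innerStop) {
        if (asset != nil) {
            NSLog(@"asset type: %@",
                  [asset valueForProperty:ALAssetPropertyType]);
        }
    }];
}                    failureBlock:^(NSError *error) {
    // Fires on denial – see Problem #2 below.
    NSLog(@"asset access denied: %@", error);
}];
```

Note the all-or-nothing shape of that failureBlock: there is no “images but not metadata” outcome, which is where the trouble starts.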
Problem #1 is that permission is tied to Location Services. I’m assuming that, because the native Camera app (and 3rd party apps in OS 4+) may embed GPS data into images, Apple’s dev team thought it sensible to have the same permission govern access to that GPS data. Unfortunately this isn’t good enough, as it’s quite reasonable that users might want to deny current GPS data to an app while still letting it extract previously-stored GPS data from images.
Problem #2 is that denying access doesn’t just withhold metadata; you don’t get access to the image whatsoever. Ultimately, there’s no way to retrieve images via the Asset Library if the user disables Location Services for your app, even if the app doesn’t touch GPS data at all.
Problem #3 – it’s slow and buggy. Looping through groups of images happens multiple times per group, and image thumbnails render extremely slowly (I think this bug actually lies within UIKit, but the effect is compounded when trying to build a custom image picker showing all the user’s thumbnails at once).
Problem #4 is that, while you may be able to explain all this clearly to your users, Location Services’ default warning makes it impossible to do so. The text presented to the user is “‘Your app’ wants to access your current location.” This is completely misleading, as the Asset Library has nothing to do with your current location.
As a final kick in the teeth, there are reports of apps being rejected because of this misleading warning. Some reviewers seem to be unaware that the Asset Library requires the Location Services permission, and have rejected apps on the basis that they ask for Location Services to be enabled but don’t offer any location-aware functionality.
The words of another disgruntled developer sum it up:
we are stuck with either potentially angering our users by asking for permission for something we don’t even use, denying them the ability to record video if we are denied location access, or continue to have a previous version written for iOS 3.0 that is completely broken when the user runs it on iOS 4.0.
Ultimately, Asset Library under 4.0 just isn’t usable. Even if the bugs are ironed out, which is likely, the framework as it stands just isn’t suitable for dealing with media in a sensible manner. I’m aware that this is just one small part of a rapidly expanding operating system, but that’s possibly part of the problem. It just feels rushed & ill-conceived, and I’m concerned it could be the first of similar issues to come.
...don’t talk about it (as a beta, it’s under NDA). If you’re enrolled in the Developer Program you can read all about 4.1, but suffice it to say things aren’t looking promising at this point. Since it’s unlikely the Asset Library’s permission system will be altered after a major release, I’m hoping that the standard image picker might get an overhaul – we’ll wait and see.