PhD write-up diary

1 September 2011

CreativePact (6 posts)

With the hand-in of my PhD not far away I have decided to spend this year's CreativePact writing a diary of its completion.

Swapping timers for observation/notification


Today I'm going to continue documenting my programming. Before I start, I should mention that I also worked on the mix of 'Futures', the second track of the Futures EP. The order of the day was vocals; I wasn't quite sure what I was going for mix-wise, but the results are a good starting point.

It's programming I want to talk about though, and in particular I want to continue singing the praises of the AVFoundation framework, with which I rewrote my app's audio engine yesterday. Today I sorted out some other details, starting with the volume slider.

Originally I had subclassed UISlider to make my own custom-design volume slider, but AVPlayer's volume can't be controlled directly with a UISlider. Instead, AVPlayer uses the system volume, accessed through an instance of the MPVolumeView class. With this in mind I decided to subclass MPVolumeView using the same graphics as my previous UISlider subclass, and the benefit is that the control is wired to the system volume for me: no need to connect it through IBAction methods in the view controller.
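For anyone curious, the subclass looks roughly like this. This is only a sketch: the image filenames are placeholders for my own artwork, and fishing the internal UISlider out of MPVolumeView's subviews relies on the control's private layout, which Apple could change in a future iOS release.

```objc
// CustomVolumeView.m — a hedged sketch of restyling MPVolumeView.
#import <MediaPlayer/MediaPlayer.h>

@interface CustomVolumeView : MPVolumeView
@end

@implementation CustomVolumeView

- (void)layoutSubviews
{
    [super layoutSubviews];
    // MPVolumeView builds its own UISlider internally; find it and
    // swap in the same images the old UISlider subclass used.
    for (UIView *subview in self.subviews) {
        if ([subview isKindOfClass:[UISlider class]]) {
            UISlider *slider = (UISlider *)subview;
            [slider setThumbImage:[UIImage imageNamed:@"volume-thumb.png"]
                         forState:UIControlStateNormal];
            [slider setMinimumTrackImage:[UIImage imageNamed:@"volume-track-min.png"]
                                forState:UIControlStateNormal];
            [slider setMaximumTrackImage:[UIImage imageNamed:@"volume-track-max.png"]
                                forState:UIControlStateNormal];
        }
    }
}

@end
```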

Next was to configure the audio session, which governs how my app plays with other apps and the main OS when it comes to audio. My app uses AVAudioSessionCategoryPlayback, which allows it to silence other playing apps. I had to remember to enable background audio too, done by adding a UIBackgroundModes key to the app's Info.plist. Handling interruptions from other apps – in particular an iPhone receiving a phone call – was part of this process.
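The session setup amounts to only a few lines. A sketch, assuming the view controller (or app delegate) owns the AVPlayer in a `player` property and adopts the AVAudioSessionDelegate protocol for interruptions:

```objc
#import <AVFoundation/AVFoundation.h>

- (void)configureAudioSession
{
    NSError *error = nil;
    AVAudioSession *session = [AVAudioSession sharedInstance];
    // Playback category: this app's audio silences other apps, and
    // (with the UIBackgroundModes "audio" entry in Info.plist)
    // keeps playing in the background.
    [session setCategory:AVAudioSessionCategoryPlayback error:&error];
    [session setActive:YES error:&error];
    session.delegate = self; // AVAudioSessionDelegate for interruptions
}

// Called when, say, a phone call comes in.
- (void)beginInterruption
{
    [self.player pause];
}

// Called when the interruption ends; reactivate and resume.
- (void)endInterruption
{
    [[AVAudioSession sharedInstance] setActive:YES error:NULL];
    [self.player play];
}
```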

Finally, the biggest upgrade came with the replacement of my tottering NSTimer system for updating the UI. I implemented this back when the app played audio using AUGraphs, and it was annoyingly complicated: a timer, firing every fifth of a second, triggered updates to the timing labels, tracking elapsed time from the start of playback.

This approach was complicated because I decided to run the timer on a separate thread, using NSThread and friends to handle calls between threads. NSTimers are really unhelpful, too: you can't pause them, you have to invalidate them and create new ones; and they just fire, leaving the programmer to mark time using NSDate instances. Figuring out pausing was a nightmare with this approach.

When implementing the audio session I came across an interesting-looking AVPlayer method: addPeriodicTimeObserverForInterval:queue:usingBlock:. This method runs a block of code at a programmer-specified interval while the AVPlayer is playing. I decided to take a look, and to my surprise found a system which could completely replace my NSTimers. When I finished the switch I had cut 200 lines of complicated, cross-thread code from my view controller.
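As a sketch of how little code this takes (the `stringFromTime:` helper and the label name here are hypothetical stand-ins, not my actual code):

```objc
// Update a timing label twice a second while the player plays.
// CMTimeMake(1, 2) = half a second; keep the returned token so it
// can be handed back to -removeTimeObserver: later.
__block typeof(self) weakSelf = self; // avoid retaining self in the block
self.timeObserver =
    [self.player addPeriodicTimeObserverForInterval:CMTimeMake(1, 2)
                                              queue:dispatch_get_main_queue()
                                         usingBlock:^(CMTime time) {
        weakSelf.currentTimeLabel.text =
            [weakSelf stringFromTime:CMTimeGetSeconds(time)];
    }];
```

Because the block is dispatched on a queue of your choosing – here the main queue – there is no cross-thread bookkeeping at all, and the observer simply stops firing while the player is paused.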

To make this work I had to run messages back from my media instance (a singleton, set up using Cocoa with Love's singleton macro) to my view controller. Normally this would be a headache, but not with Cocoa, because Cocoa allows for Key-Value Observing (KVO): you register an object as an observer of an instance variable, and when that variable changes a specific method is called on the observer. Observation works across objects and requires no polling (at least by the programmer).

So I have set my view controller to observe an instance variable storing the AVPlayer's current playing time. The media singleton updates that variable every half second through the AVPlayer's addPeriodicTimeObserverForInterval:queue:usingBlock: method, and each update triggers a refresh of the user interface through KVO. A notification at the end of the item – working alongside the player's actionAtItemEnd setting – handles resetting at the end of the tune.
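The wiring between the two objects looks roughly like this. The names `MediaManager`, `currentPlaybackTime` and `updateTimingLabels` are placeholders for my actual singleton and properties, but the KVO mechanics are standard:

```objc
// In the media singleton: the periodic observer writes the time into a
// KVO-compliant property; the synthesized setter fires the notification.
[self.player addPeriodicTimeObserverForInterval:CMTimeMake(1, 2)
                                          queue:dispatch_get_main_queue()
                                     usingBlock:^(CMTime time) {
    self.currentPlaybackTime = CMTimeGetSeconds(time);
}];

// In the view controller: register as an observer once…
[[MediaManager sharedManager] addObserver:self
                               forKeyPath:@"currentPlaybackTime"
                                  options:NSKeyValueObservingOptionNew
                                  context:NULL];

// …and react to every change, with no polling on this side at all.
- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context
{
    if ([keyPath isEqualToString:@"currentPlaybackTime"]) {
        [self updateTimingLabels];
    }
}
```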

Brilliant. So brilliant, in fact, that I'm gobsmacked by the AVFoundation framework: it truly is a fantastic API for handling media on iOS.