Software Engineer

I am a Software Engineer. I have a Bachelor of Science (Honours) in Information Technology from the University of Sunderland - Class of 2003. I have been developing software since 2001, when I was offered a role at CERN as part of their Technical Student Programme.

By 2016 I had grown really tired of the software industry, and by the end of 2019 Apple killed whatever excitement I had left. I am not sure what the next 10 years will bring. What I do know is that my appetite to do work that is impactful has only grown bigger and stronger. Great people make me tick more than anything.

I am also tired.

Sound Debugging

Interview at NSBrief: #123: NSConference 6 with Saul Mora

I have filed a bug report under rdar://16341784. See openradar.appspot.com/radar?id=5346823308836864

Xcode’s ability to play sounds is underused. Sound patterns can help debug notifications and visualise execution in unfamiliar code.

In this series of posts I will explain what sound debugging is, show where it is useful, compare it to traditional methods (e.g. logging) and bounce around a few ideas on how it might help with testing, measuring performance, debugging concurrency and teaching algorithms.

Breakpoints that play sound

In case you didn’t know, Xcode can play a sound when a breakpoint is hit: edit the breakpoint, add a “Sound” action and pick a sound. Combined with the “Automatically continue after evaluating actions” option, execution never pauses, so you can effectively create a score based on code execution.
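
If you would rather trigger sounds from code instead, the same effect can be faked with a throwaway macro. This is my own sketch, not part of the original project; NSSound is AppKit-only, so on iOS you would reach for AudioServicesPlaySystemSound instead.

    // Throwaway sketch: play a named system sound from code, mimicking
    // what an Xcode sound breakpoint does. macOS only.
    #import <AppKit/AppKit.h>

    #define QNDPlaySound(name) [[NSSound soundNamed:(name)] play]

    // Usage, on the line where you would otherwise set a sound breakpoint:
    // QNDPlaySound(@"Funk");

The breakpoint version is still preferable: it needs no code changes, no recompile, and individual breakpoints can be toggled while the app is running.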

The idea behind sound debugging is based on the timing of these sounds; i.e. for an execution path with two breakpoints, you get a clip of two sounds, one after the other.

Let’s name those two sounds “funk” and “hero” respectively. Now, the sequence of two sounds doesn’t tell you much without knowing the code they are associated with. What do those sounds actually “mean”?

Here is the cheat sheet.

funk: an instance of QNDSubject was created.
hero: an instance of QNDSubject was notified.

Now you can “visualise” code. Whenever you hear those two sounds, you know that “an instance of QNDSubject was created” and (it being the only instance) that it “was notified”.

All without taking a single look at the source code. Let’s look at the two parts of the source code that the score relates to.

First the QNDSubject class where the instance is created.

@implementation QNDSubject

+(instancetype)newSubject
{
    QNDSubject *subject = [QNDSubject new];
    [subject wasCreated];

    [[NSNotificationCenter defaultCenter] addObserver:subject
                                             selector:@selector(wasNotified)
                                                 name:QND_NOTIFY_SUBJECT
                                               object:nil];
    
    return subject;
}

-(void)dealloc
{
    [[NSNotificationCenter defaultCenter] removeObserver:self
                                                    name:QND_NOTIFY_SUBJECT
                                                  object:nil];
}

-(void)wasCreated
{
    {} // set the "funk" sound breakpoint on this line
}

-(void)wasNotified
{
    {} // set the "hero" sound breakpoint on this line
}
@end

Second, the notification.

    QNDSubject *subject = [QNDSubject newSubject];
    
    [[NSNotificationCenter defaultCenter] postNotificationName:QND_NOTIFY_SUBJECT object:nil];
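
To hear the score end to end, here is a minimal harness. This is my sketch, not part of the original post; it assumes the QNDSubject class above is in scope and that QND_NOTIFY_SUBJECT is declared once as a plain string constant:

    #import <Foundation/Foundation.h>

    // Assumption: the notification name is a plain string constant,
    // declared once and visible to both QNDSubject and this harness.
    NSString *const QND_NOTIFY_SUBJECT = @"QND_NOTIFY_SUBJECT";

    int main(int argc, const char *argv[])
    {
        @autoreleasepool {
            QNDSubject *subject = [QNDSubject newSubject]; // "funk"

            [[NSNotificationCenter defaultCenter]
                postNotificationName:QND_NOTIFY_SUBJECT
                              object:nil];                 // "hero"

            (void)subject; // keep the subject alive past the post
        }
        return 0;
    }

With the two sound breakpoints set, running this prints nothing, yet plays “funk” followed by “hero”.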

Even though the code base is relatively small in this case, it still requires more than one look to understand its execution.

  • Find the class whose instance you are interested in.
  • Look for the code that creates this instance.
  • Is QNDSubject#wasCreated called?
  • Does it register for notifications?
  • Does the notification fire?

No matter how “clean” code is, it is still hard to read between the lines.

There are two reasons why, in this case, a sound clip works better at visualising code than actually looking at the code itself.

  • The time taken to listen to the sound clip is smaller than the time taken to read the code.
  • The two sounds are closer together in time than the lines of code that make up the execution are on screen.

Now, in such a small code base, logging would likely help just as much.[1]

Where sound debugging really shines is in large[2], unfamiliar codebases (or even ones just a week old) where you need to track multiple points of execution. Even more so when such code is mostly async and loosely coupled using delegates and notifications.

Some examples of what sound debugging can help you hear:

  • The order of execution.
  • Whether an instance is reused or created each time.
  • Whether a tap is handled more than once.
  • Whether an object is ever destroyed.
  • Whether a timer is invalidated or not (see the sketch below).
  • Whether an instance no longer listens for notifications.
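
To pick the timer case: NSTimer retains its target, so an instance that schedules a repeating timer and only invalidates it in dealloc will never die. The class below is hypothetical (my example, not from the original post), but it shows where the sound breakpoints would go:

    #import <Foundation/Foundation.h>

    @interface QNDTicker : NSObject
    @end

    @implementation QNDTicker {
        NSTimer *_timer;
    }

    -(instancetype)init
    {
        if ((self = [super init])) {
            // NSTimer retains its target; self now lives as long as the timer.
            _timer = [NSTimer scheduledTimerWithTimeInterval:1.0
                                                      target:self
                                                    selector:@selector(tick:)
                                                    userInfo:nil
                                                     repeats:YES];
        }
        return self;
    }

    -(void)tick:(NSTimer *)timer
    {
        {} // sound breakpoint here: one beep per second while the instance is alive
    }

    -(void)dealloc
    {
        [_timer invalidate]; // never reached while the timer retains self
        {} // sound breakpoint here: silence means the instance never dies
    }
    @end

Let the owning object go away; if the ticking never stops and the dealloc sound never plays, you have heard the retain cycle without reading a single stack trace.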

A second part will follow.

Source Code

Origin: I had to come up with sound debugging when I needed to debug an app where one operation “sometimes” timed out before another one had been executed.

I had just been brought into a vast, unfamiliar code base involving lots of complex, async code and notifications. Knowing only two methods (#didTimeOut and #executed), I placed a breakpoint in each and listened.

After a couple of executions, I was confident[3] that the timeout always came first. That gave me a lead as to where to look and, eventually, the fix.

Being in an unfamiliar code base, I found it easier to use sound breakpoints as stepping stones, rather than method names, to get a better understanding of what’s involved in the tap of a button.


  1. Even then, logging is still slower for your brain to process and more error-prone to your eyes; i.e.
    wasCreated
    wasNotified

  2. As the code base grows, the practicalities of logging diminish. Your eyes constantly have to jump in and out of the logs, and all too often you need to indent logs so that they stand out.

  3. When dealing with async/concurrent code you can “never” be sure of timings that are outside your control.