Category Archives: Objective C

Am I a grumpy old git, or are we trying to kid the next generation that coding is really easy?


I’ve been a programmer for over 30 years. That’s not a boastful statement (at least it’s not meant to be); I’m simply saying that I’ve clearly been around the block a few times. This article is about coding and, specifically, why we’re pretending to young people that it’s an easy thing to do.

So the purpose of this post is to complain, in a somewhat grumpy-old-git kind of fashion, about initiatives like “The Year of Code” and, perhaps to a lesser extent, the “BBC Micro Bit”. But more about why I want to whinge about those in a moment. First of all, a history lesson.

A life of code
As a developer, it’s important to speak lots of languages. Not the ones where you battle with a phrase book to order a cup of coffee in a foreign land and end up asking to sleep with the barista’s sister. No, I’m talking about coding languages.

I started off, as many do, with BASIC: Beginner’s All-purpose Symbolic Instruction Code. It’s deliberately simple to use and understand (it’s not called BASIC for nothing) and has always been a good way for budding coders to discover what communicating with a computer is all about.

My BASIC experience started on a Sinclair ZX-81. It had 1KB of RAM, although I plugged in a rather wobbly 16KB RAM pack to boost this up by quite a margin. In those days, all you had to look at was a flashing cursor on a black and white screen, and there was no audio. You learned very quickly that coding can, at times, be quite difficult; that you need to be patient and methodical, and that instant gratification is unlikely until you’ve committed several hours of work and solid debugging. It’s a salutary lesson in the kind of personal attributes you require in order to be a successful programmer. If you’re the kind of person who gets frustrated easily, then coding won’t suit you. Surely, as we’ll discuss in a moment, it’s a good idea to find that out sooner rather than later?

I found that I loved coding. It didn’t always go to plan. I couldn’t always get the computer to do what I wanted it to. But that didn’t matter to me. It just made me want to discover more so that I could make the device dance to my tune.

From BASIC I went through a number of languages on different systems. Some I used for a while, some I still use parts of, and yet more are long forgotten. I wrote an integrated office suite from scratch in Pascal for my A Level Computing Science. On my Commodore Amiga I dabbled in an interesting coding language called FORTH (I wonder what happened to that). And, of course, I’ve used C, C++ and, in more recent years, Objective C, Swift, PHP and more database and web-based technologies than seems sensible to list here without boring you rigid.

Each language has its own peculiarities, advantages and disadvantages. Sometimes you have no choice about which language to use. For example, for many years Objective C was (almost) the only answer for iOS and Mac OS X development. That changed in 2014, and I’ve recently started using Apple’s new Swift language and am really enjoying it. One of the attributes I mentioned earlier is being open to change like this. The shifting sands of technology. Sometimes those sands will shift so quickly that you’ll have to change the way that you work completely. That’s just the way it is in coding, and it keeps us geeks on our toes!

After those 30 years (and then some) I’m still discovering, learning, failing (often) and debugging constantly. And I love it. Because I’m the right kind of person to be a programmer.


So time for my moan
There’s an initiative that’s been running for a while now called “Year of Code”. I don’t claim to know much about it, but what I do know is that it encourages young people to get into computer programming. Or at least that’s its intention. There are other initiatives, such as the recently launched BBC Micro Bit, that aim to do similar things.

Let’s get something straight from the start – I have no problem at all with trying to get people to code. In fact, I think it’s essential that we do; my complaint is how we seem to be going about it.

In 2014, the Year of Code, with backing from the British government, appointed a figurehead for the UK’s campaign: Lottie Dexter. She’s one of those bubbly “can do” sorts that, I guess, the people in charge thought would appeal to the media. I have nothing against Ms Dexter, and I don’t blame her entirely for the situation she found herself in; however, it was still mind-bogglingly vomit-inducing. I saw her interviewed on BBC Newsnight one evening, where it turned out that she didn’t know anything about coding. She didn’t even know what it was. She was planning to “learn it over the next year” and claimed that teachers could “be trained how to educate students in computer programming in a day”.

Clearly, no-one had bothered to check if Ms Dexter had the necessary skill set to be a coder, because (obviously) anyone can do it. Anyone can be an Ada Lovelace or an Alan Turing.

Thanks very much for belittling my profession and hard earned years of experience. No, really, thanks.

This, in a nutshell, is what annoys me about initiatives like “Year of Code”. The total lack of understanding and the failure to accept that programming takes a certain type of person and many years to perfect (if, indeed, one ever does). Like every single profession, programming is the right job for some people and the wrong job for everyone else. These schemes all seem to think that absolutely everyone can code.

I’ve got news for you: They can’t.

And there’s more…
Around the same time as Lottie-gate, I remember watching an interview with a child, no more than about 10 years old. The interviewer was saying enthusiastically that the little darling was a programmer and had written their own game. “Isn’t it amazing!” gushed the TV airhead. Well, no, it wasn’t amazing really. The “programming” in question was an environment where some elements were dragged onto a virtual canvas and then you keyed a value into a handful of boxes on screen to decide what the “program” would do. No coding was harmed (or written, for that matter) in the making of that “game”. What that proved was that the child could type a number and use a mouse. That hardly constitutes coding in my book.

Once again, this is an example of sending a signal to children that coding involves a simple bit of drag and drop or, in the case of the BBC Micro Bit, a few (very) simple commands in order to make some lights flash. The BBC Bitesize page on coding, incidentally, doesn’t even spell “program” correctly, but I digress (not for the first time in this article).

So what should they be doing?
What really needs to be done is to demonstrate to children what programming really is, warts and all. I’m not suggesting that an 8-year-old should be subjected to six weeks of intensive C++ object-orientation techniques; however, teaching the fundamentals of coding in schools, in order to find those who have an aptitude for it, would be no bad thing. In all subjects you start with the basics.

Take physics, for example. Those in year 6 won’t be discussing the plausibility of multiple universes or the finer points of string theory; they’ll be playing with weights and discovering the fundamental principles of the laws of gravity. Similarly with coding: a language like LOGO would be a good introduction, alongside the basics of binary and solving logic problems. It’s not rocket science. Those who like it will be open to the more complex sides of coding while being under no illusion about what it’s really going to be like as a profession.

In summary
You may think that I don’t want young people to learn coding because they’ll take my job away from me. That couldn’t be further from the truth. As we become ever more reliant on technology, there will be plenty of coding opportunities for all of us. However my point is that coding isn’t something that just anyone can do, and no amount of poorly thought out initiatives are going to change that.

Coding is time consuming, it often involves working to tight deadlines, and it’s not unusual for projects to change completely during their development. Don’t be surprised if you’ve spent hours working on something only to find it canned and replaced by a new requirement at the eleventh hour. It happens. Spending many hours looking for one tiny bug that comes down to a single comma or a full stop isn’t unusual either; you need the patience for days that go like that. However, seeing a web page, a computer or a tablet do something just because I’ve programmed it to work in that specific way never gets boring, and there are plenty of other devices out there that need coding in order for them to come to life.

Just because I code for a living, it doesn’t mean that I no longer code as a hobby. I was looking for a gadget recently and couldn’t find the exact spec I wanted, so while the majority of people would prefer to purchase a finished product, I’ve decided to build and program one myself, most likely based on a Raspberry Pi. I’ll do it because I can. I’ll do it because I have the right kind of logical brain that will happily work through all the inevitable problems to get to a working solution. That’s the joy of coding for me. It won’t be easy, but it will be very rewarding.

And that’s the idea that we need to sell to those children who have the aptitude to do likewise.


Why I won’t be upgrading to iOS 7 next Wednesday

Unless you’ve been hiding under a rock for the last week you’ll probably be aware that Apple announced two new iPhone models this week – the iPhone 5c and the iPhone 5s. Both launch on 20 September, with the former taking pre-orders from today. Apple also announced that iOS 7, their major overhaul of the iPhone (and iPad) operating system, will be available worldwide on Wednesday 18 September.

I won’t be upgrading.

Don’t get me wrong, I think the new OS looks fantastic and I back Jony Ive’s design 100%. Unfortunately for me, however, I can’t upgrade to the new OS because my ancient Mac Pro won’t support the tools required to develop for it.

Xcode 5 requires Mac OS X Mountain Lion as a minimum spec. My first-generation Mac Pro 1,1 won’t run the latest Mac OS, so I’m stuck on OS X Lion until Apple see fit to launch the new “Darth Vader” Mac Pro sometime before the end of the year.

If any of the apps I’ve developed don’t work under iOS 7 I am (in a word) stuffed.

November
Realistically, I can’t see Apple launching the new Mac Pro until after the new version of Mac OS X (called Mavericks) comes out and rumours are pinning that at late October. That means a November launch (or even December) for the new hardware. There’s no telling how quickly I’ll be able to get my hands on one so it could even be next year before I can actively develop for iOS 7.

I’m not afraid to say that this prospect fills me with ten tonnes of the brown smelly stuff.

Ah well. There’s nothing I can do about it so I’ll just have to wait.

New iPhone
To end on a positive note, the announcement of the new iPhone hardware was good news. I like the look of the new iPhone 5s, especially its headline feature – a fingerprint sensor built right into the home button. Gone will be the days of having to enter a passcode umpteen times a day. It’s time to upgrade my (now ancient) iPhone 4 for one of these babies. My wife already has designs on the iPhone 4, so it’ll stay around for a while yet.

I just need to work out whether to stay with O2 or move to one of the other carriers. O2 have announced two of their tariffs, but there’s no word yet from anyone else.

Delving into push notifications

I’ve been programming Apple devices now for nearly 10 years, and much has changed in that time. For the last few years I’ve been concentrating my efforts on iOS (previously iPhone OS), which is the operating system found on iPhones and iPads. Unfortunately, the Mac side has been left behind (although I hope to get back to that under Mavericks).

Over the last few days I’ve been adding push notifications to an app that I’m in the process of developing for a customer. Probably the biggest surprise for me (and perhaps I’m just being naive here) is that you can’t send out a global push to everyone who has your app – you have to send a separate push message to each of your users.

This means having to store a device token centrally for each person who runs your app and then go through each of these tokens as part of the push. Not too much of a problem if you have a few hundred users, but what about when you get into the thousands?

How many people, I wonder, have downloaded the BBC News app and how much server time must it take whenever the BBC want to send out a news flash to them? I shudder at the thought (as must their IT department).

There’s a dedicated method in AppDelegate.m that handles the registering of push notifications:

- (void)application:(UIApplication *)application didRegisterForRemoteNotificationsWithDeviceToken:(NSData *)deviceToken

It appears that this method is called not just when someone first allows your app to receive push messages (via the alert that appears asking for permission) but whenever your app is run. This I find helpful, as I’m using an ASIFormDataRequest to send the device token to a back-end server, where I check whether or not it already exists in my database. If it doesn’t, I add it. If it does, I set its most-recently-accessed date to today.
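For the curious, here’s a minimal sketch of roughly what that handler looks like. The endpoint URL and the “device_token” field name are illustrative placeholders, not my real back end:

#import "ASIFormDataRequest.h"

- (void)application:(UIApplication *)application didRegisterForRemoteNotificationsWithDeviceToken:(NSData *)deviceToken
{
    // Convert the raw token bytes into a hex string for storage server-side.
    const unsigned char *bytes = [deviceToken bytes];
    NSMutableString *token = [NSMutableString stringWithCapacity:[deviceToken length] * 2];
    for (NSUInteger i = 0; i < [deviceToken length]; i++) {
        [token appendFormat:@"%02x", bytes[i]];
    }

    // Post the token to the back end (placeholder URL); the server inserts
    // new tokens and updates the last-accessed date for existing ones.
    NSURL *url = [NSURL URLWithString:@"https://example.com/register-token"];
    ASIFormDataRequest *request = [ASIFormDataRequest requestWithURL:url];
    [request setPostValue:token forKey:@"device_token"];
    [request startAsynchronous];
}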

In my push notification process (based inside a bespoke CMS) I only retrieve device tokens for copies of the app that have “phoned home” in the previous 180 days (that may change). My thinking is that by keeping a handle on active users I can avoid sending to people who either no longer use the app or have removed it from their device. I’m hoping this, in some small way, can reduce the load on the server.

At least until I get as many users as the BBC have got(!)