In October 2018 I released version 2.0 of World of Hex, and this one came with a massive expansion to the game that for some reason I never thought was worth writing about. 🤷‍♂️
Further to that, in August 2020 I also added macOS support which was far more significant than I had expected it to be at the time.
Now, in 2022, as I embark on another series of enhancements to the game, I thought it might be worth writing a little about the process of adding an entire (well sort of) solar system to the game.
When I first built the game I already had the Moon (or Luna) moving around the Earth in the right position for the current time, and I was quite happy with that, even if I might not have got every aspect of it right. I'd had thoughts of adding a colony to Luna, but that wasn't part of the original plan, so I launched the game with just Earth as a playable colony. It seemed like enough.
But then in 2018 I got the itch to expand, and with Mars being a hot topic I thought it would be neat to add a colony not only to Luna, but to Mars as well. After all, with everyone wanting to be the first to get to Mars (how many Netflix series are there taking people to Mars?), I thought it would be a good drawcard for the game.
Add to that the fact that I'd been thoroughly enjoying The Expanse™ on Netflix, and that itch grew; I began to wonder just how hard it would be to "expand" the game to encompass more of the solar system, providing colonies on much more than just Luna and Mars.
Prototyping a solar system
So it was time to start learning how to build a model of our solar system. Google was my greatest initial resource along with Wikipedia, and soon I happened upon a set of data that I could use to model the positions of each of the planets in the solar system.
I started off with a small Swift project just as a prototype to see if I could do what I wanted in SceneKit. The existing app used SceneKit to display the Earth (a sphere) surrounded by a Hexasphere, with another appropriately scaled sphere (Luna) orbiting the Earth.
This prototype adopted a set of orbital elements, or Keplerian elements, to place each planetoid (as I called all of the planets and Luna) in the correct position for a given date and time. I was thus able to render a basic solar system using SceneKit quite easily.
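To give a flavour of what that involves, here is a minimal Swift sketch of the kind of calculation the prototype performed. The structure and names are my own simplification for this post, not the actual game code, and the real element tables also carry rates of change so each element can be propagated to the requested date.

import Foundation

// A simplified set of Keplerian elements (angles in radians, distances in AU).
struct KeplerianElements {
    var semiMajorAxis: Double            // a
    var eccentricity: Double             // e
    var inclination: Double              // i
    var longitudeOfAscendingNode: Double // Ω
    var argumentOfPeriapsis: Double      // ω
    var meanAnomaly: Double              // M for the date in question
}

// Solve Kepler's equation E - e·sin(E) = M for the eccentric anomaly (Newton's method).
func eccentricAnomaly(meanAnomaly M: Double, eccentricity e: Double) -> Double {
    var E = M
    for _ in 0..<10 {
        E -= (E - e * sin(E) - M) / (1 - e * cos(E))
    }
    return E
}

// Heliocentric position of one planetoid, in ecliptic coordinates.
func position(for el: KeplerianElements) -> (x: Double, y: Double, z: Double) {
    let E = eccentricAnomaly(meanAnomaly: el.meanAnomaly, eccentricity: el.eccentricity)
    // Position in the orbital plane.
    let xOrb = el.semiMajorAxis * (cos(E) - el.eccentricity)
    let yOrb = el.semiMajorAxis * sqrt(1 - el.eccentricity * el.eccentricity) * sin(E)
    // Rotate by the argument of periapsis (ω), inclination (i) and ascending node (Ω).
    let cw = cos(el.argumentOfPeriapsis), sw = sin(el.argumentOfPeriapsis)
    let ci = cos(el.inclination), si = sin(el.inclination)
    let cO = cos(el.longitudeOfAscendingNode), sO = sin(el.longitudeOfAscendingNode)
    let x = (cO * cw - sO * sw * ci) * xOrb + (-cO * sw - sO * cw * ci) * yOrb
    let y = (sO * cw + cO * sw * ci) * xOrb + (-sO * sw + cO * cw * ci) * yOrb
    let z = (sw * si) * xOrb + (cw * si) * yOrb
    return (x, y, z)
}

From there, SceneKit just needs those coordinates scaled into the scene.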
I was then able to take that prototype, rewrite the key elements of it in Objective-C and integrate it into World of Hex.
That then meant I needed to craft Hexaspheres for both Luna and Mars to act as representations of the colonies on those planetoids. I built these based on my own naive ideas of what the colonies there might look like, shape-wise. (In hindsight I think the Mars colony is very poorly laid out and ignores the topography of the land entirely 😬)
At this point, the core functionality supporting multiple worlds with colonies worked, and I had these three colonies, but I still needed to add a way for the player to move between them.
Enter the solar system view. As I mentioned above, I'd been watching and thoroughly enjoying The Expanse™, and one wonderful thing I saw in a number of episodes was a holographic, interactive rendering of the solar system, showing the trajectories of planetoids (and, in this case, spacecraft). Here is one example from Season 1, episode 6:
I wanted something like this. And when a player chose to shift focus to a colony on another planetoid (at this time, Earth, Luna or Mars), I wanted the camera to be manipulated so that it looked as if the player were in a spacecraft, and that spacecraft would complete a partial orbit, turn to face the destination and then travel there.
This video shows what I was aiming for in this. It’s rough, based on a very early attempt at the animation.
I spent what now seems a ridiculous amount of time (months) obsessing over getting this animation working. In the end, it beat me. I could not get it to work the way I wanted it to.
What I'd wanted, in my I-am-not-a-pilot-or-astronaut position, was to compute the point on the current planetoid's orbit that is closest to the destination planetoid, travel in-orbit to that point, turn there to face the destination, and then travel to the destination, keeping in mind that it is moving through space as the spacecraft moves (so, in essence, travelling to where it will be when my spacecraft is due to arrive). To this day, I'm still not sure where I went wrong.
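For the curious, the last part of that, aiming at where the destination will be rather than where it is, boils down to an iterative guess along the lines of the sketch below. This is the idea only, with illustrative names; the real (now disabled) code has far more moving parts.

import Foundation
import simd

// Estimate where a destination planetoid will be when the "spacecraft" camera arrives,
// by repeatedly refining the travel time. positionAt is a stand-in for the game's
// ephemeris lookup (seconds from now -> position), and speed is the camera's speed.
func interceptPoint(from start: simd_double3,
                    speed: Double,
                    positionAt: (TimeInterval) -> simd_double3) -> simd_double3 {
    var target = positionAt(0)
    for _ in 0..<5 {                                   // a handful of iterations converges quickly
        let travelTime = simd_distance(start, target) / speed
        target = positionAt(travelTime)                // where the destination will be by then
    }
    return target
}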
The code is still there inside the current version of World of Hex; it’s just not enabled anymore. Apart from the problems I had getting it to work (so, so many variables), it also became obvious that if the player just wants to move from, say, Earth to Mars, the transition, whilst nice, would probably end up taking too long, and the player would just find it boring.
Below is a snippet of what it looks like when I re-enable that code within the current baseline.
At this point, being heavily inspired by The Expanse™, I started to think about adding more colonies. That also meant adding moons. Then of course, in The Expanse there are also colonies on some of the larger asteroids.
This presented a problem for me. The Keplerian elements are great, and the implementation was remarkably easy thanks to some hard work by people much smarter than I am, but you can't use the same mechanism for some of the moons, or for the asteroids. I couldn't find usable data for them until, with some advice from my online friends at Stack Exchange (in this case, the Astronomy Stack Exchange), I realised I needed a different source of ephemeris data altogether.
I ended up in communication with the kind people at NAIF, and one particular person from the Astronomy Stack Exchange, who gave me advice and guidance on what I needed to do to get the information I needed from their CSPICE library. This required that I port their library (itself a conversion of the original Fortran into C) to compile and be usable on iOS and macOS. macOS was easy, but I found that the library assumes the files it reads are in the "current directory", which is not going to be the case on an iPhone or iPad. So I went through the process of adding a wrapper to the library, along with a way to tell CSPICE where to look for any files it needs to access.
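The wrapper itself is more involved, but the essence of the fix is simply to resolve absolute paths at run time and hand those to CSPICE, rather than relying on the working directory. Roughly speaking, and assuming the CSPICE headers are exposed to Swift via a bridging header (the kernel names below are illustrative):

import Foundation

// Load a SPICE kernel bundled with the app, using its absolute path rather than
// assuming it sits in the current working directory. furnsh_c is CSPICE's
// standard kernel-loading routine.
func loadBundledKernel(named name: String, ofType type: String) {
    guard let path = Bundle.main.path(forResource: name, ofType: type) else {
        fatalError("Missing SPICE kernel \(name).\(type)")
    }
    furnsh_c(path)
}

// Example: a leap-seconds kernel plus a merged ephemeris file.
loadBundledKernel(named: "naif0012", ofType: "tls")
loadBundledKernel(named: "worldofhex", ofType: "bsp")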
This all worked a treat, and I was also able to get their TSPICE library of unit tests running, allowing me to validate what I’d done.
I've since been given the OK by the people at NAIF to make the Git repos for both public:
With CSPICE in place and working, my next problem was that the data required to feed into CSPICE to provide the paths of all of the planets, moons and asteroids was bigger than I could reasonably load into memory.
As mentioned in my Stack Exchange question, I then used the SPKMERGE tool to generate ephemeris files that contain just the data my app needs. The app loads those files via CSPICE and generates the positional data it requires. The data currently built into the app should be good until 2050. If World of Hex is still running by then, I'm afraid it will probably be someone else updating the data files.
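Once the kernels are loaded, getting a position out of CSPICE is pleasantly brief. Here's a hedged sketch of the kind of call involved (again via a bridging header; the body name, frame and observer are just one sensible choice, not necessarily what the game uses):

import Foundation
import simd

// Ask CSPICE where a body is, relative to the Sun, at a given UTC time.
// The result is a position in kilometres in the J2000 frame.
func position(of body: String, atUTC utc: String) -> simd_double3 {
    var et = 0.0
    str2et_c(utc, &et)                                  // UTC string -> ephemeris time

    var pos = [Double](repeating: 0, count: 3)
    var lightTime = 0.0
    spkpos_c(body, et, "J2000", "NONE", "SUN", &pos, &lightTime)
    return simd_double3(pos[0], pos[1], pos[2])
}

let mars = position(of: "MARS BARYCENTER", atUTC: "2022-06-01T00:00:00")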
At this point, the solar system managed by World of Hex comprised the following (colonies marked with a ⬢):
Sun
Mercury
Venus
Earth ⬢
    Luna ⬢
Mars ⬢
    Phobos
    Deimos
Jupiter
    Io ⬢
    Europa ⬢
    Ganymede ⬢
    Callisto ⬢
Saturn
    Titan ⬢
Uranus
    Titania
Neptune
Pluto
Ceres ⬢
Pallas ⬢
It's worth mentioning here that both Phobos and Deimos, the moons of Mars, move so fast that it became too difficult to keep the camera focused on them, so, apart from the Sun itself, these two planetoids are the only ones you can see but not visit.
Textures and visualisations
Another thing to consider, now that I had positional data, was providing textures for each of the planetoid bodies. For this I initially relied heavily on a website called Solar System Scope. (Incidentally, this really is a terrific resource, and it gave me some inspiration as well.) From there I was able to obtain high-quality textures for most planets; those I could not get there I sourced from sos.noaa.gov, or from artists who contributed to the Celestia project. Credits are all provided in the game on the credits screen (if you can find it…)
For the two asteroids, Ceres and Pallas, I confess to cheating and just treating them as spheres so that I could use Hexaspheres to represent their colonies.
The Great Conjunction
Back on the 22nd of October 2020, I was able to record the Great Conjunction, the lining up of Saturn, Jupiter and Earth, from within World of Hex, showing that my use and implementation of the SPICE libraries had worked. Here is a short recording I shared with the folk at NAIF:
Scale: the art of fitting an entire solar system onto a screen
One of the things I had to wrestle with was how to deal with the vast distances covered by the planets, and especially by the asteroids and the dwarf planet Pluto, while still being able to show more than just a dot for the inner planets.
If you look at most implementations, and Solar System Scope is no exception, it’s impossible to fit it all on a single screen, whether it’s an iPhone or a 42″ TV.
So whilst World of Hex computes everything correctly, the representation is all scaled in post-processing of the computed positional data so that:
You can see everything within reason; and
It is still a reasonable representation of the distances.
In addition to scaling the distances, I also took the step of scaling the sizes of the planetoids, and this scaling changes between normal play mode, focused on one planetoid, and the solar system view mode. When in solar system view mode, the scale of both the distances and the sizes is dynamically recomputed so as to make the solar system easily navigable.
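The exact numbers were tuned by eye, but the idea is simple: compress the computed distances with something gentler than a linear scale so that Mercury and Pluto can share a screen. A rough sketch of the approach; the exponent and scale factor below are illustrative placeholders, not the game's values.

import Foundation

// Compress a real heliocentric distance (in km) into a scene-friendly value.
// A power law keeps the ordering of the orbits while pulling the outer planets
// in close enough to fit on one screen.
func displayDistance(forRealDistance km: Double) -> Double {
    let exponent = 0.4      // < 1.0 squeezes large distances more than small ones
    let sceneScale = 0.05   // overall fit-to-screen factor
    return pow(km, exponent) * sceneScale
}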
This gave me an approximation of what I’d seen on The Expanse™ that I was happy with.
Now, if our iPhones or iPads had a 3D holographic projector that could place the solar system all around us as seen in The Expanse™ I’d be looking at how to make that happen…
So what’s going on now?
I've written this (rather long and rambling) post because I've been working on some changes to the game as part of a push to improve its presentation, perhaps open it up to more players, etc. I've just finished a two-month effort to create the first large update since bringing the game to macOS back in August 2020. The update notes for v4.2 are longer than I ever intended them to be.
I initially intended "simply" to add portrait mode to the iPhone. As part of doing that, one thing I needed to fix was how I put text on the screen.
When I originally developed World of Hex, the SKLabelNode, part of SpriteKit, did not support attributed text (text whose style can vary within the string), so I'd built my own way to do this that worked, but not well.
Apple have kindly added native support for attributed text in the meantime, so I've moved to using that, along with an entirely different method of creating the attributed text for display, via a wonderful library of code called SwiftRichString. My expectation now is that I will be able to add support for more languages, thus widening the audience of the game.
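SwiftRichString takes care of building the styled strings from tagged source text; displaying the result is then just a matter of handing an NSAttributedString to the label, which SKLabelNode now supports natively. A minimal sketch of the SpriteKit side (with the styling spelled out by hand here rather than via SwiftRichString):

import SpriteKit
import UIKit

// Build a styled string and hand it to an SKLabelNode.
func makeTitleNode(text: String) -> SKLabelNode {
    let attributes: [NSAttributedString.Key: Any] = [
        .font: UIFont.boldSystemFont(ofSize: 28),
        .foregroundColor: UIColor.white
    ]
    let label = SKLabelNode()
    label.attributedText = NSAttributedString(string: text, attributes: attributes)
    label.numberOfLines = 0    // allow longer messages to wrap
    return label
}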
The full update notes for version 4.2 are listed below:
World of Hex is starting a new phase of enhancement, and to kick this off, the way text is displayed on the screen has been improved, so if you’ve seen a partial message, or been confused about something, perhaps this will help.
On iPhone and iPad, support for Portrait mode has been added, at last! Yes, you can now play World of Hex one-handed on your iPhone or iPad whilst standing on the train commute to work. With this rather large change a number of related bugs have been fixed that have been plaguing players for a while now.
New users are now introduced to their AI module right from the get go, so that they know there is more to the game to build towards. Did you know that you can program your own AI with commands to defend the world tiles you win? No? Well all you need to do is reach level 2, and access will be granted!
The leader board now displays more information about how well those players at the top have been doing.
For those of you that have noticed that the Earth seemed to get confused about just where the Sun actually is, and turned the wrong face to the Sun, I believe I have finally fixed this. It won’t affect play at all, but it affects my sanity. So this way I get some peace of mind too.
Finally, finally, World of Hex no longer stops your favourite music from playing when you launch the game! I’m sorry this one has taken so long.
Panning a world on macOS now works as nicely as it does on iOS, iPadOS and tvOS. And it’s even nicer on those too!
The player information panel now shows a small meter to indicate to you how much more you need to play before reaching the next level.
Experience points (XP) can be earned faster now with the addition of a “win multiplier” for each world tile. If you, or someone from the same faction keeps winning consecutive games in a given world tile, then XP are earned much faster.
On macOS, panning to rotate the selected planet or moon using two fingers now works as smooth as silk (finally).
Fixed a number of nasty memory leaks and buggos.
Fixed a problem where the colour of a tile was not changing after you win a game. This bug was introduced back when I added the Game Center Achievements. My apologies; it’s fixed again.
A note for the wary. Apparently, if you force-kill World of Hex, iCloud can get its knickers in a knot and stop sending the app the background notifications that allow it to keep the state of the world tiles up to date. If you've done this, and think the tiles are not accurate, then the only way to fix it (apparently) is to reboot the phone. Not my preferred advice to anyone, but it's all I got. I spent quite some time trying to work out if I'd done something wrong, but no…
On the Apple TV, something special, well I think so. You can now zoom in and out in little bits.
And where to next?
Now that v4.2 is out there, the current roadmap of things I want to do is:
Allow it to be played offline (which, I understand, is needed for an Arcade title).
Add portrait mode on iPhone.
Add localisations to the game to make it more accessible to non-English speaking players.
Add accessibility to the game.
Add controller support, especially for tvOS.
Add more visual polish and eye-candy.
Add more moons and asteroids.
For fans of The Expanse™ add a pocket universe and allow the Ring Gate to be activated.
Add an Easter egg or two. There is already one (the credits scene) if you can find it.
I call this a roadmap, which implies some sort of order. As you can see, the order means little. Some items are easy enough to do, though they cost money (and believe me, this game does NOT pay its way) because I need to pay others for things like translations.
If you’ve made it this far, thanks for reading. I hope it wasn’t too hard to follow.
Back in the 1980s, when I used to spend way too much time playing games on my Apple IIGS (and earlier, my Apple IIe), one of my favourite games was Fortress, by SSI.
Fortress gave me a small game board where I would fight it out against one of several computer AIs; a game consisted of 21 turns, and whoever controlled most of the game board at the end was the winner.
One of the things I loved about Fortress was the way the AIs got smarter with time. When you first started playing, it was easy to win, but after a few games it became more challenging. This kept me coming back to Fortress as I felt I was playing against something that basically learnt as I did.
As a programmer/developer, my mind is rarely idle, and I always have a project on the go. In 1994 I thought it would be neat to rewrite Fortress for the Apple IIGS, using higher resolution graphics.
I started doing this with ORCA/Modula-2, which I had recently brought to the Apple IIGS with publishing help from The Byte Works and some connections at Apple.
As part of writing this blog post, I’ve run up my Apple IIGS environment (yes, I still have all of it) within the wonderful Sweet16 emulator and found that code:
I hadn't realised just how much of the game I had written. I thought I'd only written a bit of the game logic; however, it turns out I'd written a lot of the UI as well, as can be seen from when I ran it. The AIs hadn't been written, but the basic building blocks were there.
The funny thing is, I have the code; I have a compiled binary that I can run, but I can’t remember how to re-compile the source code anymore. I’ve got a build script there, but my memory fails to help me out.
One of these days I should bring all that code out, and store it somewhere safer.
Around this time I got distracted, and most of my home-based projects took a back seat, Fortress included. My work took me away from Apple development entirely for around 15 years.
So Fortress GS was left on a floppy disk (or two) in a box of backup floppies along with everything else.
Then, in 2012, after I'd been back developing for Apple hardware again for a few years, I got the bug again and, having recovered my entire Apple IIGS development environment from hundreds of floppies and some second-hand SCSI drives (my how they've grown; did you notice the size of the "M2" hard drive above?), I was able to revisit Fortress GS.
I ported the guts of the code to Objective-C and wrote a basic prototype to show to another developer at the time as a proof of concept. This one was really basic, but it allowed me to place moves for both sides by tapping the screen.
I showed this to a designer I knew at the time who thought the idea was great, but suggested that it would be more interesting with a hexagonal grid rather than the rectangular one.
I toyed with the idea at the time, but I did nothing with it; I had other projects happening, and I wanted to focus on my educational apps.
Moving up to 2016, and the release of the new Apple TV, I launched my latest educational app, Tap Tangram (which I later relaunched as Classroom Math Drills). Due in part to my failure to recognise that I'd missed my target, and to the complete lack of featuring by Apple, the app never gained any traction and failed at launch.
That left me wondering what to do next, and then it occurred to me to reboot the Fortress app idea once again. I’d also recently read a most-excellent blog article by @redblobgames about manipulating hex grids in software, so my mind was abuzz with what I could do with it.
Enter World of Hex, my latest, and final attempt to reimagine the classic Fortress for iOS and the Apple TV.
I started out just playing with the hexagonal grid code that I'd written as a port of the code provided by @redblobgames, and getting the basic board working with the underlying move computations.
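The core of that port is tiny: axial coordinates, a neighbour lookup and a distance function, straight out of the @redblobgames write-up. A sketch of the idea in Swift (the game's actual code is Objective-C and carries a lot more baggage):

// Axial coordinates for a hex grid, as described by @redblobgames.
struct Hex: Hashable {
    var q: Int
    var r: Int
}

// The six neighbouring offsets around any hex.
let hexDirections = [Hex(q: 1, r: 0), Hex(q: 1, r: -1), Hex(q: 0, r: -1),
                     Hex(q: -1, r: 0), Hex(q: -1, r: 1), Hex(q: 0, r: 1)]

func neighbour(of hex: Hex, direction: Int) -> Hex {
    let d = hexDirections[direction % 6]
    return Hex(q: hex.q + d.q, r: hex.r + d.r)
}

// Distance between two hexes, via the cube-coordinate identity s = -q - r.
func distance(_ a: Hex, _ b: Hex) -> Int {
    let dq = a.q - b.q, dr = a.r - b.r
    return (abs(dq) + abs(dr) + abs(dq + dr)) / 2
}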
Once I'd done that, I sat down and brainstormed how I wanted the app to work and how the game would play, and during this process I asked myself:
“What if, rather than a simple rectangular grid of cells, we had a map of the world as a map of hexes?”
And then I got going.
“What if, the terrain was somehow represented in this 2D map of hexes. Rather than try to represent the 3rd dimension as a true 3rd dimension, colour the hexes to represent the terrain.”
and
"Hmm. How many cells?"
"Earth's land surface area: about 150,000,000 km²."
"If we say each hex has a real-world "size" of 1 km, then we need to be able to map out 150 million hexes eventually. Even if they aren't all being used by players, we need a way to know where on the Earth a hex maps to land."
"So, what is probably easier is to map the entire planet with hexes, and then mark some as usable land, and others as ocean, unusable land, etc. That means a lot more hexes in the database though. It means millions of hexes to cover the planet completely. Too many."
"Will performance be an issue? Yes."
And so it went; with performance an issue and no real idea at that point of how to make it all happen, I went hunting for others who had built a world of hexes. I needed to get an idea of:
Could I get the basic mechanism to work on an iPhone?
How many hex tiles would I need to build a reasonable approximation of the Earth's land areas?
How would it perform if I built a model with all those tiles?
After some searching with Google, I happened upon the wonderful Hexasphere.js by Rob Scanlon. This gave me hope. If this could be done in a browser, then I could do it.
So I set about porting his Hexasphere JavaScript code to Objective-C to see what I could achieve.
This is where I started to hit upon the boundaries of my knowledge of 3D modelling and SceneKit. I also found myself struggling with some of the maths concepts involved, having to trust in these people who obviously handle it better than I do.
I did get Hexasphere working, though it was extremely slow because every hexagonal tile was being implemented as a separate SceneKit node. It did work, but it just wasn't going to cut it for a production-quality game. At this point I was using very large hexagonal tiles, so the tile count was still quite low. Once I increased the resolution of the model, there would be a lot more.
I ended up posting a question or two on the Apple developer forums and the Games Stack Exchange. These helped me better understand how to improve the performance of my 3D model; however, I was still hitting problems in that the on-screen representation of the Hexasphere was not of high enough quality.
I spent several weeks working on it and getting some great help from colleagues who knew maths and 3D rendering far better than I do. The end result of that was a perfectly rendered Hexasphere using only 4 SceneKit nodes that rendered at a full 60fps on devices as old as the iPad 2. The change was to put all of those tiles into a single model, and to colour them individually via the shader and its inputs.
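The shape of that change, in broad strokes: instead of one node per tile, build a single geometry whose vertices carry a per-tile colour. The sketch below shows the geometry side in Swift; it's a simplification of the idea rather than the game's Objective-C code, which also splits the tiles across those four nodes and drives colour changes through shader inputs.

import SceneKit

// Build one SceneKit geometry for many hex tiles, with a colour baked in per vertex,
// instead of creating a separate node for every tile.
func makeHexasphereGeometry(vertices: [SCNVector3],
                            colours: [SIMD4<Float>],
                            indices: [Int32]) -> SCNGeometry {
    let vertexSource = SCNGeometrySource(vertices: vertices)

    let colourData = colours.withUnsafeBufferPointer { Data(buffer: $0) }
    let colourSource = SCNGeometrySource(data: colourData,
                                         semantic: .color,
                                         vectorCount: colours.count,
                                         usesFloatComponents: true,
                                         componentsPerVector: 4,
                                         bytesPerComponent: MemoryLayout<Float>.size,
                                         dataOffset: 0,
                                         dataStride: MemoryLayout<SIMD4<Float>>.stride)

    let element = SCNGeometryElement(indices: indices, primitiveType: .triangles)
    return SCNGeometry(sources: [vertexSource, colourSource], elements: [element])
}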
I finally had what I needed to get on with the game.
At this point it was just a matter of bringing all of the pieces of the puzzle together and making them work well.
For this game, the main pieces were:
The hexasphere code
The Hex Grid code
SceneKit and SpriteKit
CloudKit (iCloud based database)
I’ve already spent enough time on the hexasphere and hex grid, so I’ll try to restrict the rest of this post to the hurdles I had finishing off the app and bringing it all together.
SceneKit and SpriteKit
Apple's engineers have done a wonderful job of these two APIs. Having developed most of my apps with Cocos2D, the transition to SpriteKit and SceneKit was pretty painless. The primary difference for me was the coordinate system.
The main reasons I went with Apple’s frameworks this time were:
I wanted to be able to render the 3D world, which Cocos2D wouldn’t do.
I also wanted to branch out and learn something new.
That said, the trick was that I needed to be able to overlay my 2D game components on top of the 3D components. After a little research I discovered that Apple had kindly given us an “easy” way to do this via the overlaySKScene property of the SCNView class.
This works remarkably well; however, it does introduce some problems because there are bugs in the Apple frameworks (at least, there are at the time I write this). I found that some things, like animations of the SpriteKit nodes, need to be forced to run within the SceneKit renderer thread. It seems that Apple uses a multi-threaded renderer for SceneKit/SpriteKit, and some operations that you'd expect to be thread safe aren't.
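The workaround amounts to funnelling those SpriteKit mutations through SceneKit's render loop instead of running them from wherever they happen to be triggered. A hedged sketch of the pattern (the class and method names are illustrative, not the game's):

import SceneKit

// Queue SpriteKit changes and apply them from SceneKit's render callback so they
// run on the renderer's thread rather than from arbitrary game code.
final class OverlayCoordinator: NSObject, SCNSceneRendererDelegate {
    private var pendingActions: [() -> Void] = []
    private let lock = NSLock()

    func performOnRenderThread(_ action: @escaping () -> Void) {
        lock.lock(); pendingActions.append(action); lock.unlock()
    }

    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        lock.lock(); let actions = pendingActions; pendingActions.removeAll(); lock.unlock()
        actions.forEach { $0() }
    }
}

With the overlay scene set via overlaySKScene and the coordinator installed as the view's delegate, anything kicked off this way stays on the right thread.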
With a lot of help from Apple Developer Technical Support, I found and fixed this problem and filed a bug report #32015449 (github project) accordingly.
Another issue related directly to the use of overlaySKScene was an incompatibility with the tvOS focus engine (it basically doesn't work). I ended up having to port a focus engine I'd written for Cocos2D on tvOS and enhance it to work with World of Hex. I've also filed a bug report for this issue: #30628989 (github project).
Apart from this, SceneKit and SpriteKit work a treat and have made my life so much easier.
CloudKit and iCloud Integration
Once I'd decided to expand the original game beyond a single game board, and to allow people to play games in a world of game boards, I needed a way to store the game boards in the cloud so that everyone sees the same thing.
When I started to develop this idea, my family and I were enjoying Pokemon GO for the novelty it provided. As a user, one of the things I really didn't like about Pokemon GO was the way it forced us to either associate our existing Google account with the app, or create a brand new Google account just for the game. There were other options, but they all involved forcing the user to log into a specific account, just for the game.
So I looked at Apple’s CloudKit which is just one part of the whole iCloud service layer that Apple has been building and developing for years now. One of the beauties of CloudKit is that for every person using an iPhone or iPad that is logged into iCloud, an app integrating CloudKit will just work because there’s no explicit login required.
This is what I wanted. On the whole, the CloudKit integration was very straightforward, and it does just work.
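For a sense of how little ceremony is involved, here is a hedged sketch of the kind of CloudKit call the game makes. The record type and field names are placeholders for this post, not the real schema.

import CloudKit

// Save the result of a game into the app's public database so every player sees it.
func saveTileResult(tileID: String, winningFaction: String) {
    let record = CKRecord(recordType: "WorldTile")     // illustrative record type
    record["tileID"] = tileID as CKRecordValue
    record["winningFaction"] = winningFaction as CKRecordValue

    let database = CKContainer.default().publicCloudDatabase
    database.save(record) { _, error in
        if let error = error {
            print("CloudKit save failed: \(error)")
        }
    }
}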
I really enjoyed the ease with which Apple have allowed us to define our database structure via the CloudKit dashboard, make changes and even migrate those changes from development to production environments painlessly.
If there is one thing that I found lacking it is that in the dashboard, there is no way to simply remove all existing data without also wiping the database model itself.
Conclusion
World of Hex has grown far beyond what I originally set out to write. It’s nothing like my original attempt back in 1994 on the Apple IIGS, and even my really early brainstorming of last year differs somewhat from what I’ve built.
One of the reasons I build these apps is for the challenge and to keep my active mind busy. I certainly don’t make much of an income from them (though, mind you, I wouldn’t complain), so there’s a lot of satisfaction in having an idea realised and released into the world. Yes it can be crushing when it doesn’t take off, but, as I mention in the credits scene within World of Hex (can you find it?), “Never Give Up”.
Learning some of the quirks of Apple’s frameworks has certainly been a challenge. Cocos2D has been wonderful to work with over the years, and in some ways it’s more mature and easier to work with than SpriteKit, however SpriteKit’s deep integration is hard to pass up now that I’ve learnt it.
SceneKit offers some pretty amazing functionality from my point of view. I remember, as a teenager back in the early '80s, having a book with some algorithms for 3D line-art animation that blew me away at the time. Being able to draw a model in your fave modelling tool, drop it into Xcode and have it on a device screen in minutes is insanely great. For developers out there who think it's tough work creating an app, you have no idea how spoilt you are.
If you’ve read through all this, then thanks for staying till the end. It grew somewhat longer than I’d planned.
Here it is, my World of Hex. I hope you take the time to have a game, and that you enjoy it.
I’ve been working away on my latest app, and was just creating a new piece of artwork for the splash screen. When I did this, I wanted to start with the iPad Pro 12″ and scale down within Photoshop to maximise the quality of each asset size.
For all of my other assets, I had started with the iPad for some stupid reason and got my math all confused.
So I went hunting for a guide, and found an old site from Ben Lew of Pi’ikea St. It was a little out of date, plus I really wanted to calculate the sizes using the iPad Pro as the starting point.
So I took Ben’s page and popped it into a spreadsheet. The result is available below for download. I’ve also taken a screenshot so that you can see it easily.
In Photoshop, something Ben taught me to do a few years back was to create a layer called “default”, and, in order to get Photoshop to export the various layers in my file as appropriately sized assets, add the sizes as percentages along with folder names.
For me, assuming my originals are for the iPad Pro 12″, this means I give my ‘default’ layer the name:
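The exact string isn't reproduced here, but the format is a comma-separated list of scale-plus-folder specifications on a layer whose name starts with "default". Purely as an illustration (the percentages and folder names below are made up, not the values from my spreadsheet), it looks something like:

default 100% ipad-pro/, 75% ipad/, 50% iphone/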
So, I’ve been working on the tvOS port of Tap Tangram (due out, March 17!) and have a few observations. As some know, I like to use Cocos2D for my apps; it gives me a huge degree of flexibility for building the UI and doing what I want in the app.
One of the things I’ve been doing in apps for the past year or so is providing a lot of configuration options for the player, and what I’m finding now is that the UI I typically build for this on iOS just doesn’t work on tvOS.
Tap Tangram Player Editor
For one thing, the focus engine on tvOS does not like to play with non-UIKit buttons and so on. If I have a name field, which tends to use UIKit under the covers, I end up with a single UIKit object on the screen, while the rest are my own Cocos2D buttons and switches.
Now, back before the new Apple TV went on sale, I put a lot of time into producing a focus engine for Cocos2D that mimics Apple’s engine (you’ll find it in the tvOS branch of Cocos2D v2.2 here). It works really well, and I’ve used it in both GALACTOBALL and Tap Times Tables. It’s not quite as clever as Apple’s one, but it works quite well and has a flexible API.
I've updated this API to work with the latest version of Cocos2D, and have been integrating it into Tap Tangram; however, on this player editor, tvOS won't play nicely because it wants you to do everything its way.
The end result is that the UITextField is given focus by tvOS even when I don’t want it to. Apple, for reasons of their own have made it really difficult to control the focus engine in our user interfaces. It’s all UIKit, or no UIKit, unless you can find some tricky workaround.
In this instance I have not been able to find a workaround that is satisfying. It feels clumsy.
So what to do?
Write a brand new UIKit Player Editor, that’s what!
After mulling over my nice UI and wondering how to squeeze that tvOS square peg into my Cocos2D round hole, I realised that even if I got it to work, my UI just didn't make as much sense on the TV. I look at those switches, and I want to flick them. I look at the slider and I want to slide it. On tvOS, this just doesn't make sense because it's not a touch interface, even if you are using the Siri Remote.
So I decided to start from scratch, and write a basic UIKit UI for the player editor.
As soon as I started to lay it out I discovered that on tvOS, some of the user interface features we know and love are missing. There is no UISlider. There is no UISwitch. How was I supposed to put a toggle switch on screen if Apple haven't given us one? I took a look at the Settings app on the TV. Pretty much everything is done via tables. Toggles are simple table cells that, when clicked, toggle their state.
I can do that for all those switches, but what about the slider? Well, at the moment, it looks like I will have to implement this as a cascading picker so that when the user clicks on “Maximum Value” it will change to a simple picker. It means less flexibility for the user, but ease of use.
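As a sketch of what I mean by toggles as table cells (simplified, and with made-up option names; the real editor has many more rows):

import UIKit

// A minimal tvOS settings-style table where each row acts as a toggle:
// selecting the row flips its state and shows a checkmark, no UISwitch required.
final class TogglesViewController: UITableViewController {
    private var options = ["Sound Effects": true, "Music": false]
    private var keys: [String] { return options.keys.sorted() }

    override func tableView(_ tableView: UITableView, numberOfRowsInSection section: Int) -> Int {
        return keys.count
    }

    override func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
        let cell = UITableViewCell(style: .default, reuseIdentifier: nil)
        let key = keys[indexPath.row]
        cell.textLabel?.text = key
        cell.accessoryType = (options[key] == true) ? .checkmark : .none
        return cell
    }

    override func tableView(_ tableView: UITableView, didSelectRowAt indexPath: IndexPath) {
        let key = keys[indexPath.row]
        options[key] = !(options[key] ?? false)
        tableView.reloadRows(at: [indexPath], with: .none)
    }
}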
The upshot of doing it this way is that I no longer have to worry about the focus engine, because tvOS will do everything for me. The downside is that I'm going to have this screen (or two) that, although very functional and easy to use, will not look in any way consistent with the rest of the app.
In summary…
So, either way I have to make compromises. Do I stick with my own look and feel, and find a way to make it work, or do I take the “easy” path, use UIKit and accept that it just won’t look as nice (in my opinion)?
I’ll continue to experiment as I move forward. Unfortunately, the main game screen of Tap Tangram is a really really complicated combination of scrolling areas, buttons, and tangram pieces that can flip, rotate and be moved. I can’t take the UIKit approach there, so whatever I do on the Player Editor screen, I’m still in for some fun.
I’m about to embark on a collaboration with another developer. We want to create something new and fun. One of the first things to crop up is the tools that we use. In the interests of documenting what I use, I thought I’d write it as a blog post for all.
One of the amazing things about software development is that we developers can be very passionate about what we use, and how we use it. Some developers love getting their hands dirty by doing all the hard stuff themselves. Some like the ease of point-and-click programming (and there are some of us who wouldn't call that programming, but we're probably being snobbish).
Me? I’ve been around long enough now to have got my hands dirty on a whole bunch of things over the years. I started out with AppleSoft Basic on an Apple IIe, and progressed through a whole suite of tools and languages until the Apple IIGS was discontinued in the mid-nineties. I could go on about those days and the years between then and the current “App” development wave, but that’s not what this post is about (if you want to hear more about the “good old days”, then let me know via comments; if there are enough then perhaps I’ll take a stroll down memory lane).
I won’t attempt to compare what I use against what others use here; this is simply a write-up of what I use, and briefly, why.
I would like to point out though that this post is probably best for other developers, or budding developers. I will use terms and jargon here and there that won’t mean much to non-developers.
Programming
Perhaps it's something to do with my age and where I've come from, but I like coding by hand. Don't get me wrong, I'm happy for an IDE (Integrated Development Environment) to do some simple stuff for me, but for a lot of it, I'm more than happy to type things out from scratch. The act of typing in code, even what might be template code to others, connects me with what I'm doing; it's an opportunity to construct the tapestry as I work, to think as I type. Having a lot of it done for me means that, typically, I'm allowing the tool to dictate limits, and sometimes its own design, on what I am creating. Coding by hand means that the limits are my own.
For iOS App development, my coding environment of choice is Apple’s Xcode. This is a terrific, free, IDE that comes with absolutely everything a developer needs to code an app and submit it to the App Store. Now I say everything, and it’s true, but in reality there are things like images, icons, sounds, documents, etc that also help to make up an app, and creation of those falls to other tools.
Apple has traditionally encouraged the use of Objective-C for all development. Most developers either love or hate Objective-C. I actually enjoy it as a language. When Apple introduced Swift in 2014, I was a bit surprised. I knew that lots of people don’t like Objective-C, but I didn’t think people would be so happy to see a replacement. Swift has so far managed to fail to capture my attention. I have no desire thus far to change; Objective-C works, and works well.
App User Interface
Apple, for the iOS environment, pushes its own UIKit, and I've used this in several of my apps. It works well, and it's very powerful. I don't, however, like to use UIKit for the educational apps and games that I create. For these I have used Cocos2D. I've been using Cocos2D since it was version 0.99. It's now up to version 3.3. Most of my educational apps use version 2.2 of Cocos2D, though future apps will most likely use version 3.3 or later.
This GIT repository contains an entire Cocos2D project that I put together to demonstrate the use of Apple’s Accessibility API in conjunction with Cocos2D.
Going forward, Cocos2D is now integrated into a new IDE called SpriteBuilder, which is freely available at http://www.spritebuilder.com. SpriteBuilder provides a very powerful environment that allows you to design the UI of your app in a way that can be built for both iOS and Android. I have yet to test/try the Android side of things, but feedback from other developers who have used Apportable, which has been integrated into SpriteBuilder, has been very encouraging.
SpriteBuilder creates, from your design, an entire Xcode project that you can then add code to, and build for submission to Apple. They integrate well, and the powerful thing is that once you open up the Xcode project, you can forget about SpriteBuilder if you choose and hand-code the rest of the app. It really is, for me, a good blend of the two.
If I’m building a UIKit app, then I do everything in Xcode.
Code Management
When I first started working with Xcode, I used “cvs” to manage and control the various versions of my code. It worked well, but in the years since then, the development world has moved on. These days, the trendy choice is “git”, and for me, it’s a good choice. It works well in a local environment, and I’m able to set up remote environments so that I can easily backup my code to a file server, or the cloud.
For code version control using "git" I use SourceTree, by Atlassian. SourceTree is available for free from sourcetreeapp.com. It's a great tool, very powerful, and it integrates beautifully with a cloud-based service called Bitbucket, also by Atlassian. I use Bitbucket because I can create unlimited private repositories for free, and it's very handy for sharing code with other team members. I use SourceTree on my Mac to manage daily commits of code, and then push those commits to the cloud or a file server periodically so that I have backups.
Artwork
For a lot of my apps, I've created most of my own artwork. Until late last year I did all of this using a free app called "GIMP". It's a great tool, and for people like myself who work on a very low budget, it works well. It's cross-platform and there's even a version on iOS called ArtStudio (though they don't call it GIMP, when you look at its feature set and menu structure, I'm convinced that it's built from a GIMP codebase).
With the new version of Money Up, I moved to the Adobe Creative Cloud suite, and Photoshop. Whilst I enjoyed GIMP and became proficient using it, I now really enjoy the power provided by Photoshop and the higher quality outputs achieved by using vector-based shapes for the drawings. GIMP is a raster-based editor, and as such, is unable to export cleanly scaled images in the same way.
Sound and Music
Before I mention the tools I use on my Mac to edit sounds and music I want to mention two websites I use to source most of my sounds and music:
incompetech.com – This is a wonderful site by Kevin MacLeod who shares a vast library of his original royalty-free music. I’ve used a number of pieces from this site; the ability to browse using a terrific filter makes life much simpler.
freesound.org – This site is a powerhouse, full of sound recordings. Be careful to observe the licenses attached to individual recordings.
For most of my sound editing, I use Audacity, a free and yet, very powerful sound editor. When I needed to clean a large number of sound recordings for Money Up however, I used Audition CC, part of the Adobe Creative Cloud suite. For me, it’s not as intuitive as Audacity however it’s very capable, and some things are easier to do.
I always export my sounds as AIFF files, but I don't use those within the apps that are submitted to Apple. Most apps don't need high-quality sounds, especially for simple sound effects. What I do is run a short script over all of my AIFF files to convert them to CAF files, which take up less space but still sound just fine on an iOS device.
This script comprises:
#!/bin/sh
FILES=`find . -name \*.aiff`
for F in $FILES;
do
    DF=`basename -s .aiff ${F}`
    echo "Converting ${F} to ${DF}.caf"
    # The conversion step itself (the original listing stopped short here);
    # afconvert ships with macOS, and ima4-compressed CAF keeps sound effects small.
    afconvert -f caff -d ima4 "${F}" "${DF}.caf"
done
It’s been a busy couple of months since Apple released iOS 8, the new iPhone 6 and 6+, Yosemite and … hold on, this post is supposed to be short. If I list everything that Apple announced, I’ll be here all day!
On with it. Since Apple’s big event, I, like most other active app developers, have been very, very busy. I’ve been working away at a new app for special needs education, and a complete rework of my first special needs app, Dollar Up. I’ve also had to update all of my apps to be sure that they work on iOS 8.
As if that wasn’t enough, I’ve also written another new app called 9 Letters which is due to launch this Thursday, the 20th of November. 9 Letters is all about letting the inner word searcher go mad. If you like Scrabble™ or Boggle™, or just about any other word game, I think you’ll like 9 Letters.
One of the really neat features of iOS 8 and Yosemite is Continuity. I love it; it brings to Apple devices a wonderful synergy where they work together to become a single powerful tool that anyone can use. No more do we have to close a document, save it somewhere special and then reopen it on another device in order to continue our work. We can now just hand off the document, in its current state, from one device to another.
When I was nearing completion of 9 Letters I realised that the game would benefit from this neat feature in iOS 8, so with a remarkably small effort (Apple really made the process very easy) I added Handoff to 9 Letters, and I love it.
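Under the hood, Handoff is driven by NSUserActivity: the sending device advertises an activity describing the in-progress game, and the receiving device rebuilds the board from it. A hedged sketch of the shape of it (the activity type and userInfo keys are illustrative, not the ones 9 Letters actually uses):

import UIKit

// Advertise the current game so another device can pick it up via Handoff.
func startAdvertisingGame(boardState: [String: Any], from responder: UIResponder) {
    let activity = NSUserActivity(activityType: "com.example.nineletters.game")
    activity.title = "9 Letters game in progress"
    activity.userInfo = boardState          // just enough state to rebuild the board
    responder.userActivity = activity
    activity.becomeCurrent()
}

// On the receiving device, the app delegate's application(_:continue:restorationHandler:)
// pulls the same userInfo back out and restores the board.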
Now, when I am on the train home from my day job I can play a game of 9 Letters on my iPhone, and then handoff the game to my iPad when I get home. It really is great.
Below is a short video I recorded tonight that tries to demonstrate just how great this is. I hope you enjoy it. I also hope that other developers get behind Apple with Continuity and all it offers; working with computers and mobile just got even easier.