I’ve been working away on my latest app, and was just creating a new piece of artwork for the splash screen. When I did this, I wanted to start with the iPad Pro 12″ and scale down within Photoshop to maximise the quality of each asset size.
For all of my other assets, I had started with the iPad for some stupid reason and got my math all confused.
So I went hunting for a guide, and found an old site from Ben Lew of Pi’ikea St. It was a little out of date, plus I really wanted to calculate the sizes using the iPad Pro as the starting point.
So I took Ben’s page and popped it into a spreadsheet. The result is available below for download. I’ve also taken a screenshot so that you can see it easily.
In Photoshop, something Ben taught me a few years back was to create a layer called “default” and, in order to get Photoshop to export the various layers in my file as appropriately sized assets, add the sizes as percentages along with folder names.
For me, assuming my originals are for the iPad Pro 12″, this means I give my ‘default’ layer the name:
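The original post shows the exact layer name at this point. To illustrate the Photoshop Generator convention being described, a “default” layer name takes a form roughly like the one below; the percentages and folder names here are purely illustrative (the scaling factors you need depend on your own target devices), so check Adobe’s Generator documentation for the exact syntax:

```
default iPadPro/, 75% iPad/, 50% iPhone/
```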
As some may be aware, the Parse service is to be shut down on the 28th of January. Parse gave us developers 12 months to sort ourselves out and find another place to host our data and drive our services.
I’ve been using Parse for a couple of years now, to provide a push notification service to my uAlertMe app. I was looking at removing the app from the app store, and discontinuing support, because I couldn’t find a cost effective way to keep it all running. uAlertMe is an app that sells perhaps 200 copies a year, so there’s not enough income to cover monthly service fees.
Then, late in 2016, I saw a message on one of the local developer groups that Buddy had established a relationship with Parse, and was providing a wonderful migration tool that lets us, in a relatively pain-free manner, take our data from the existing Parse service, import it into Buddy, and (hopefully) sit back.
Now I’d have to say that it wasn’t quite that easy. Because I jumped on board pretty quickly, and because I wanted to use the push notification system, I was relying on features that hadn’t been completely polished.
So, with some really terrific support from the kind folks at Buddy, I worked on getting everything working, and as of today, iAlertU and uAlertMe are happily talking via Parse on Buddy.
So, if you still haven’t migrated your Parse data, and are wondering what to do, you have 21 days (as I write this) remaining. Get on over to Buddy and get the process started.
Sort of late posting this here (it’s been on FB and Twitter for a week now).
I’ve discounted most of my educational apps till the end of the year. This is a great opportunity to pick up some great educational apps for those iPad and iPhone gifts:
Please like/share and if you purchase, please rate/review the app.
You know, I think Apple and their incredible focus on accessibility is amazing. My daughter and I are currently waiting for Apple to review our first sticker app, called Heroes and Villains Stickers.
My daughter put all the artwork together, all drawn on her iPad using the ArtStudio app, and I did the “programming” bit (though programming is a stretch given Apple have made it so easy to put a sticker app together). This project is a labour of love for her, as she’s a real fan of the characters she’s drawn.
As we put it all together I realised that all those stickers had names, and that those names actually have function and meaning as you put the sticker app together.
Sticker Properties
When all was ready to submit, I got my daughter to go through all of them and name them with text that reflected the meaning of the stickers.
That has paid off in the final product: with VoiceOver turned on, tapping on a sticker causes the iPhone to read the name of the sticker out loud.
This is terrific as people with impaired vision can enjoy the stickers too, and send them to people that can see them, knowing that the sticker means the right thing.
It is details like this that make it easy for me to continue working with Apple and their ecosystem as I know that when Apple talk about inclusivity, they mean it.
I’ve been waiting to see what Apple would announce this week. I’ve been hoping for an excuse to upgrade my 2013 MacBook Pro (with Retina display), partly because I’d like some more storage (it’s amazing how quickly a 256GB drive fills up once you start developing apps), and partly because my eldest starts university next year, and I thought upgrading would mean I could hand down my more than capable existing MacBook Pro.
At the moment, she’s working with my original MacBook, a 2009 white unibody MacBook which, as can be seen, has had better days. It still works a charm, though it’s been running hot, with all fans howling for a year or two now.
So, like everyone else working with a MacBook (Pro), I was keenly awaiting the refresh of the line this week. The rumour mills are pretty accurate these days, so we already had a fair idea of what to expect as a minimum, however to be honest, I’d been hoping for more.
I’d also been hoping for a realistic move forward from where I am now, and I really don’t think Apple have provided this.
Dongles Everywhere, it’s the future man…
Last year I saw the new 12-inch MacBook released with its single USB-C port and quickly wrote the device off as a waste of time. The lack of a separate power connector, and specifically a MagSafe power connector, was to me a huge step backwards. The smaller screen size made it doubly unattractive.
When I think about buying a new computer, I like to think I’m buying something that will allow me to continue from where I am, and transition over the next few years to a point where I’ll be ready for the next transition.
With the 12″ MacBook, and these new MacBook Pros, this is not the case. If I were to upgrade to one of the new MacBook Pro units, none, repeat, none of my existing peripheral hardware would be able to connect to it without these stupid, ugly white dongles.
Here is an image from dailytech.com showing the mess Apple is moving us towards (this for a 12″ MacBook, but you get the idea):
Every day, I connect to my MacBook Pro via the standard USB connector, various iPhone or iPad devices, other devices to charge, external drives for backup, and so on. Other people have more than I do.
If I were to ‘upgrade’ to one of the new MacBook Pros, I’d have to find a way to connect all of these devices via a USB-C port. What’s more I’d need to purchase a number of these ugly white dongles (why can’t I get them in a colour to match the MacBook I’ve just hypothetically bought?) if I want more than one plugged in at a time. Apple ‘kindly’ gave us 4 USB-C ports on these new MacBooks but that just encourages us to purchase more dongles.
Pricing
Note: OK, I’ll be quoting Australian dollar amounts here, but they should be indicative of other markets.
Here in Australia, the starting prices (it’s not even worth looking at the spec’d-up prices, really) for each of the 3 new MacBook Pros are as follows:
13″ MacBook Pro (no Touch Bar): $2199
13″ MacBook Pro (Touch Bar): $2699
15″ MacBook Pro (Touch Bar): $2999
Apart from the Touch Bar, there are some other subtle (or not so subtle) differences, the main ones being the speed of the CPU and the amount of storage.
Now, like many of you, and certainly like many iOS developers who, contrary to popular myth, aren’t living the high life off app sales, I find those prices just aren’t viable. My current MacBook Pro was bought as a refurbished unit from Apple, and now that I’ve seen these prices, that’s probably the way I’ll go next time too.
Apple are basically saying to me, “Hey Peter, we understand you can’t afford our sparkly new Touch Bar MacBooks, but you can always buy the new MacBook Pro without the Touch Bar. It’s only $2199!”
My answer to this is that for that amount, I’d be buying a MacBook with the same amount of storage, a slower CPU, less connectivity and less expandability (I currently have a 128GB SD card in the SD card slot to expand my storage economically), and all for more than I’d pay for a refurbished 2015 MacBook Pro with twice as much storage (see image to the right).
No thanks Apple.
I love your hardware, and I really enjoy the ecosystem you have created. My family is well and truly committed to Apple tech too, but this time around, we’ll be avoiding your new MacBook Pros. If there is a new MacBook to be bought, it will be one that we can use now with the peripherals we have now.
So, I’ve been working on the tvOS port of Tap Tangram (due out, March 17!) and have a few observations. As some know, I like to use Cocos2D for my apps; it gives me a huge degree of flexibility for building the UI and doing what I want in the app.
One of the things I’ve been doing in apps for the past year or so is providing a lot of configuration options for the player, and what I’m finding now is that the UI I typically build for this on iOS just doesn’t work on tvOS.
Tap Tangram Player Editor
For one thing, the focus engine on tvOS does not like to play with non-UIKit buttons and the like. If I have a name field, which tends to use UIKit under the covers, I end up with a single UIKit object on the screen while the rest are my own Cocos2D buttons and switches.
Now, back before the new Apple TV went on sale, I put a lot of time into producing a focus engine for Cocos2D that mimics Apple’s engine (you’ll find it in the tvOS branch of Cocos2D v2.2 here). It works really well, and I’ve used it in both GALACTOBALL and Tap Times Tables. It’s not quite as clever as Apple’s, but it does the job and has a flexible API.
I’ve updated this API to work with the latest version of Cocos2D, and have been integrating it into Tap Tangram; however, on this player editor, tvOS won’t play nicely because it wants you to do everything its way.
The end result is that the UITextField is given focus by tvOS even when I don’t want it to be. Apple, for reasons of their own, have made it really difficult to control the focus engine in our user interfaces. It’s all UIKit, or no UIKit, unless you can find some tricky workaround.
In this instance I have not been able to find a workaround that is satisfying. It all feels clumsy.
So what to do?
Write a brand new UIKit Player Editor, that’s what!
After mulling over my nice UI and wondering how to squeeze that tvOS square peg into my Cocos2D round hole I realised that even if I got it to work, my UI just didn’t make as much sense on the TV. I look at those switches, and I want to flick them. I look at the slider and I want to slide it. On tvOS, this just doesn’t make sense because it’s not a touch interface even if you are using the Siri Remote.
So I decided to start from scratch, and write a basic UIKit UI for the player editor.
As soon as I started to lay it out I discovered that on tvOS, some of the user interface features we know and love are missing. There is no UISlider. There is no UISwitch. How was I supposed to put a toggle switch on screen if Apple haven’t given us one? I took a look at the Settings app on the TV. Pretty much everything is done via tables. Toggles are simple table cells that, when clicked, toggle their state.
I can do that for all those switches, but what about the slider? Well, at the moment, it looks like I’ll have to implement it as a cascading picker, so that when the user clicks on “Maximum Value” it changes to a simple picker. It means less flexibility for the user, but greater ease of use.
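The toggle-as-a-table-row pattern is simple enough to sketch. Here it is as a tiny, UIKit-free model (plain Python standing in for the real table cell code; the setting names and method names are mine, not Tap Tangram’s):

```python
# Sketch of the tvOS "toggle as a table row" idea: selecting the row
# flips the value, and the cell's detail label is redrawn from the model.
class ToggleRow:
    def __init__(self, title, is_on=False):
        self.title = title
        self.is_on = is_on

    def select(self):
        # In the real app this would be invoked from
        # tableView(_:didSelectRowAt:) when the focused row is clicked.
        self.is_on = not self.is_on

    def detail_text(self):
        # What the cell's right-hand label would display.
        return "On" if self.is_on else "Off"

row = ToggleRow("Sound Effects", is_on=True)
row.select()  # user clicks the row on the Siri Remote
print(row.title, row.detail_text())
```

Because the state lives in the model and the cell just redraws from it, the focus engine only ever sees ordinary table cells, which is exactly what it wants.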
The upshot of doing it this way is that I no longer have to worry about the focus engine, because tvOS will do everything for me. The downside is that I’m going to have this screen (or two) that, although very functional and easy to use, will not look in any way consistent with the rest of the app.
In summary…
So, either way I have to make compromises. Do I stick with my own look and feel, and find a way to make it work, or do I take the “easy” path, use UIKit and accept that it just won’t look as nice (in my opinion)?
I’ll continue to experiment as I move forward. Unfortunately, the main game screen of Tap Tangram is a really really complicated combination of scrolling areas, buttons, and tangram pieces that can flip, rotate and be moved. I can’t take the UIKit approach there, so whatever I do on the Player Editor screen, I’m still in for some fun.
Today, after what has seemed like an eternity of waiting (it wasn’t but it seemed like it), Apple kindly approved the latest update to “Dollar Up”. The app is now called “Money Up” and it has taken on a whole new look.
With the invaluable help of Ben Lew of Pi’ikea St, the app has been completely revamped. The user experience is smoother, and to top it all off, now sports 5 wonderful, friendly characters:
From left to right, they are: Grace, Sparkles, Ted, Zilla and Pirate Joe.
For a brief preview of the new Money Up, watch this video:
For those people that have purchased “One More” in the past and would like to enjoy the new Money Up app, I’m willing to offer a free promo code for Money Up if you can provide me with proof of purchase.
One More was originally created in an attempt to make Dollar Up more “searchable” for people in the UK that might have been put off by the icon and the word “Dollar”. Now that v2.0 has been released, and the emphasis on the “Dollar” has been taken away I’ve elected to discontinue “One More”.
As I said above, if you’ve previously purchased One More, then please send me proof of purchase (either a receipt from Apple, or a screenshot of the main menu) via email to support [at] pkclsoft.com and I will send you back a promo code for Money Up.
It’s been a busy couple of months since Apple released iOS 8, the new iPhone 6 and 6+, Yosemite and … hold on, this post is supposed to be short. If I list everything that Apple announced, I’ll be here all day!
On with it. Since Apple’s big event, I, like most other active app developers, have been very, very busy. I’ve been working away at a new app for special needs education, and a complete rework of my first special needs app, Dollar Up. I’ve also had to update all of my apps to be sure that they work on iOS 8.
As if that wasn’t enough, I’ve also written another new app called 9 Letters which is due to launch this Thursday, the 20th of November. 9 Letters is all about letting the inner word searcher go mad. If you like Scrabble™ or Boggle™, or just about any other word game, I think you’ll like 9 Letters.
One of the really neat features of iOS 8 and Yosemite is Continuity. I love it; it brings to Apple devices a wonderful synergy where they work together to become a single powerful tool that anyone can use. No more do we have to close a document, save it somewhere special and then reopen it on another device in order to continue our work. We can now just Handoff the document, in its current state, from one device to another.
When I was nearing completion of 9 Letters I realised that the game would benefit from this neat feature in iOS 8, so with a remarkably small effort (Apple really made the process very easy) I added Handoff to 9 Letters, and I love it.
Now, when I am on the train home from my day job I can play a game of 9 Letters on my iPhone, and then handoff the game to my iPad when I get home. It really is great.
Below is a short video I recorded tonight that tries to demonstrate just how great this is. I hope you enjoy it. I also hope that other developers get behind Apple with Continuity and all it offers; working with computers and mobile just got even easier.
With the advent of iOS 8, Apple has added what it calls “App Previews” to the App Store so that developers can showcase their apps with an up to 30 second video of their app.
These videos are supposed to be simple screen captures, with perhaps a little post editing done in an application like iMovie. They are not supposed to be anything more than that. With that in mind, Apple have made it pretty easy to record your App Preview and upload it to iTunes Connect.
Even so, it seems that some people are still struggling with what tools to use, and how to get a file that they can upload that meets Apple’s rules.
The main things you need to take into consideration are:
The video must be less than 30 seconds in duration.
The video must be less than 500MB in size.
The video must be the correct resolution. The required resolution depends on the device class you are submitting the video for. When I submitted my videos, the iPhone 6 had not been released, so I only had to create videos for the iPad and iPhone 5. Apple don’t allow us to create App Previews for the iPhone 4.
For the iPad, videos must be 1200 x 900 pixels
For the iPhone 5, videos must be 1136 x 640 pixels
Before you start – ingredients
For any act of creativity, be it cooking, or recording an App Preview, you need the right ingredients. For me, I use:
Mac Mini running the latest OS X Yosemite.
iPhone 5 or iPad Air.
Lightning to USB cable.
QuickTime Player application (part of Yosemite) for video capture.
iMovie 10.0.5 for video editing.
HandBrake v0.9.9 (x86_64) for final video encoding and cropping.
Capturing the raw video
Apple clearly states that if you install OS X Yosemite and connect your device via a USB cable, you can use the QuickTime Player application to capture the video in its rawest format.
This is actually really easy. Once you have your device (for me, either an iPad Air or an iPhone 5s) connected, run QuickTime Player.
If you’re lucky, QuickTime Player will automatically find your device and you’ll see the device’s screen pop up in a window on your Mac. If not, from the File menu, select “New Movie Recording”:
This will, if it still doesn’t find your device, open a window that looks like:
Within this, click on the down arrow next to the record button and ensure that your device is selected. Also make sure that your device is selected as the “microphone”. This is important so that any sound your app produces is recorded along with the video.
Once you’ve done this, you should see your device’s screen within a window on your Mac:
So now all you need to do is click on the record button, and then use your app while the Mac records everything you do. Be sure to have a script to follow so that you cover everything you want to show off about your app within your App Preview. Apple has a great podcast within the WWDC 2014 collection that covers this sort of thing.
TIP:
If your app has background music, turn it off before you record your video. It will be much easier to add the music to the edited video later on. In all likelihood you will end up chopping your video up into pieces and rearranging things to make best use of the 30 seconds. When you do this, any sound that is running along in the background also gets chopped up and ruins the continuity of the video. I’ve learnt this the hard way.
Once you’ve finished playing with the app, click on the record button to stop recording.
You then need to save the recording somewhere suitable via the File menu in QuickTime Player.
Editing your video
Now that you have your raw video (I like to name them with the word “raw” in the filename for clarity), you need to edit it, and turn it into a 30 second masterpiece complete with any flashy effects, music or voiceovers. For this, I use iMovie. It’s actually a pretty decent movie editor and I find that I can do most things I need within it. There are other far more capable video editors available, but for me, iMovie works well.
I won’t go into too much detail on how to use iMovie to do the editing, but I’ll address those issues that are important to getting your video ready for iTunes Connect. Here are the basic steps I follow:
Start by launching iMovie and creating a new event for your App Previews.
With the new event selected, click on the “Create” button, select “Movie”, and then select “No Theme”.
Click on “Create” and enter an appropriate name for the project. I tend to follow a convention where if my raw file is “Appname Phone raw.mp4”, my iMovie project is called “Appname Phone IM”.
At this point you have a blank time line. Drag your raw video file into this timeline and get creative!
When you’re happy with your 30 seconds of blockbuster video masterpiece (be absolutely certain it’s less than 30 seconds!), it’s time to export the project to a file. Click on “Share”.
Next you are prompted to name your video file and select the resolution. For an iPhone 5, it is vital that you select “HD 720P 1280 x 720”. For an iPad, select “HD 1080P (1920 x 1080)”
That is the end of the iMovie phase of the project. You can now close iMovie if you are feeling confident.
Final Step – Encoding and Cropping
As I mentioned earlier on, Apple have very specific requirements about the resolution of the video you upload for an App Preview. To reiterate:
For the iPad, videos must be 1200 x 900 pixels
For the iPhone 5, videos must be 1136 x 640 pixels
So what gives? Apple’s own tools don’t give us a video that matches these at all! Using QuickTime Player and iMovie we end up with:
1920 x 1080 instead of 1200 x 900, and
1280 x 720 instead of 1136 x 640
Well, as it turns out, the resolution of the iPhone 5 video is almost exactly the aspect ratio Apple requires, and we can scale the iPad video so that we get the full target height, then crop the sides to get the exact video resolution we want.
Don’t understand that? Don’t worry, I didn’t at first either; it took me a while to realise that I needed to scale the videos from iMovie to fit what Apple wanted. For the iPhone this is trivial, but for the iPad scaling wasn’t enough because it left ugly black bands down each side. Here is a screenshot of an iPad video I produced in iMovie a week or two ago:
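The scale-then-crop arithmetic is easy to check for yourself. A quick sketch (plain Python, nothing app-specific):

```python
def scale_then_crop(src_w, src_h, dst_w, dst_h):
    """Scale the source video to the target height, then report the
    scaled width and how much width must be cropped (split across
    both sides) to hit the target."""
    scale = dst_h / src_h
    scaled_w = src_w * scale
    return scaled_w, scaled_w - dst_w

# iPad: iMovie's 1920 x 1080 down to Apple's required 1200 x 900.
# Scales to 1600 wide, so about 200 px must come off each side.
print(scale_then_crop(1920, 1080, 1200, 900))

# iPhone 5: 1280 x 720 down to 1136 x 640.
# Scales to roughly 1137.8 wide, so there's almost nothing to crop.
print(scale_then_crop(1280, 720, 1136, 640))
```

This is why the iPhone video needs essentially no cropping while the iPad video loses a noticeable slice from each side.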
How do we fix this? I use a wonderful tool called HandBrake (http://handbrake.fr). This great application, which runs on OS X, Windows and Linux, is a free video transcoder. It has a wealth of features and does what we need beautifully.
One great feature of HandBrake is that you can create a “preset” which describes how you want your output to be encoded and sized, and then save that preset for later use. I’ve done this for both the iPhone 5 and iPad App Previews, and this has worked perfectly for me every time.
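If you prefer the command line, HandBrake also ships a CLI that can do roughly what the GUI preset does. The invocation below is a sketch from memory, not taken from my preset files, so verify the flags against `HandBrakeCLI --help`. Note that HandBrake crops first and scales second: cropping 240 px from each side of the 1920 x 1080 source gives 1440 x 1080, which is 4:3 and scales cleanly to the required 1200 x 900.

```
HandBrakeCLI -i "Appname iPad IM.mp4" -o "Appname iPad preview.mp4" \
  -e x264 -q 20 --crop 0:0:240:240 --width 1200 --height 900
```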
One huge benefit is that the quality of the final video is great, and it’s also very well compressed. You can probably tune things but for me these presets work.
So, start up HandBrake, and from the “Presets” menu, select import to import the presets I’ve provided below:
Now, drag the video you saved from iMovie onto the HandBrake main window (or select it via the “Open” dialog) and you should see all sorts of details about your video:
Note that in the Presets “drawer” on the side, I’ve selected “iPhone5”. This is one of the presets I’ve made available above.
Note also at the bottom where you can see the resolution of the source video (the one from iMovie) and the resolution of the output (what you have to upload to iTunes Connect). The output is exactly what you want.
All you need to do now is click on the “Start” button and wait for HandBrake to tell you it’s finished.
You’re done! You now have a video in the correct format, duration and size for a direct upload to iTunes Connect. Remember that you need to do this with Safari on a Yosemite Mac (as far as I know).