Mobile applications that don’t load quickly don’t succeed. We recently worked with Twitter and Microsoft on their official Twitter application for Windows Phone 7 (WP7), and that application is all about users getting their timeline items quickly.
Loading speed also affects tombstone performance; tombstoning is the term for when an application is shut down because it has been deactivated. Deactivation can occur for a variety of reasons: the user hits the Start button, or the application uses a launcher or chooser (other than the photo browser or camera). When your application is reactivated, it is like starting over again. You can watch Jaime Rodriguez and me talk about tombstoning and its performance implications in a PDC 2010 session here.
To improve loading speed, I tried several serialization techniques for Windows Phone. We’ve been sharing our Lessons Learned with the dev community, and many of our discoveries are likely to make it into vNext apps.
Here are a few of the techniques:
Avoid Using ApplicationSettings (or don’t store too much data here)
IsolatedStorageSettings.ApplicationSettings is a dictionary-based store that uses XML serialization under the hood. The first time you use it, it loads every setting stored in the dictionary. This isn’t efficient: data you might never need gets loaded on the main UI thread. When I used it for Twitter, I was seeing an 8-second load time, which is much too long for a mobile application. If you take longer than 10 seconds to load (or show UI) while activating, the OS will shut your application down.
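For illustration, the pattern to avoid looks something like this (the setting name is made up). The key point is that the very first access to ApplicationSettings deserializes the whole XML-backed dictionary on the UI thread:

```csharp
using System.IO.IsolatedStorage;

// First access loads ALL stored settings, on the UI thread:
var settings = IsolatedStorageSettings.ApplicationSettings;

string lastUser;
if (settings.TryGetValue("LastUserName", out lastUser))
{
    // use the cached value
}

settings["LastUserName"] = "someone";
settings.Save(); // serializes the whole dictionary back out
```

A single small setting is harmless, but once large cached objects live in this dictionary, every cold start pays for all of them at once.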
Use File-based Storage for Settings and Cache
I quickly moved to file-based storage in isolated storage. This gave me three benefits:
- Saving and loading can be done off the main UI thread. For Twitter, settings are loaded on the main UI thread, while cached data (timeline, lists, etc.) is loaded on a worker thread.
- Loads and saves can be split across separate files to further reduce the amount of data read or written. Twitter has separate files for the home timeline, mentions, messages, lists, etc., and these are loaded on demand.
- I can quickly change the serialization strategy and find the best one.
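A minimal sketch of the first two points, loading one cache file off the UI thread (the file name, `HomeTimeline.LoadFrom`, and `ShowTimeline` are hypothetical names, not the shipping code):

```csharp
using System.IO;
using System.IO.IsolatedStorage;
using System.Threading;
using System.Windows;

ThreadPool.QueueUserWorkItem(state =>
{
    using (var store = IsolatedStorageFile.GetUserStoreForApplication())
    {
        if (!store.FileExists("HomeTimeline.cache"))
            return; // nothing cached yet; fall back to a network call

        using (var stream = store.OpenFile("HomeTimeline.cache", FileMode.Open, FileAccess.Read))
        using (var reader = new BinaryReader(stream))
        {
            var timeline = HomeTimeline.LoadFrom(reader); // hypothetical deserializer

            // Marshal back to the UI thread to update the page.
            Deployment.Current.Dispatcher.BeginInvoke(() => ShowTimeline(timeline));
        }
    }
});
```

Because each cache lives in its own file, the mentions or messages files are only touched when the user actually navigates to those pivots.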
Use Binary Serialization
I went through several XML and JSON serializers, but the timeline still loaded as slowly as if I were calling Twitter again for the data. I wanted the application to genuinely feel like it was loading cached data, so we looked into why XML and JSON serialization were slow on the phone. It wasn’t isolated storage, since the transfer speeds are fast, so we determined it must be a CPU issue. Binary serialization avoids parsing and reflection, saving that CPU time. It made a major difference for Twitter: in some cases loading was more than 10 times faster.
Implementing binary serialization is straightforward but can be tedious for large objects. Here is how we implemented it for Twitter:
- All classes saved implement a custom interface ISerializeBinary. ISerializeBinary has Load and Save methods along with a FileVersion property.
- Load and Save methods use BinaryReader and BinaryWriter to load and save the objects.
- We wrote separate loading code for each file version, since a binary file is read in order rather than parsed like XML or JSON; adding data to the middle of a binary file breaks the existing loading code. To branch the loading code, we use the FileVersion property read from the file.
- Extension methods on BinaryReader and BinaryWriter helped make the code easier to read and reuse.
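The steps above can be sketched as follows. The interface shape matches the description (Load, Save, FileVersion), but the `Tweet` class, its fields, and the `WriteSafeString`/`ReadSafeString` helpers are illustrative assumptions, not the shipping code:

```csharp
using System;
using System.IO;

public interface ISerializeBinary
{
    int FileVersion { get; }
    void Save(BinaryWriter writer);
    void Load(BinaryReader reader);
}

public class Tweet : ISerializeBinary
{
    public int FileVersion { get { return 2; } }

    public string Text { get; set; }
    public string Author { get; set; }
    public DateTime CreatedAt { get; set; } // hypothetically added in version 2

    public void Save(BinaryWriter writer)
    {
        writer.Write(FileVersion);      // version goes first so Load can branch
        writer.WriteSafeString(Text);   // fields are written in a fixed order
        writer.WriteSafeString(Author);
        writer.Write(CreatedAt.Ticks);
    }

    public void Load(BinaryReader reader)
    {
        int version = reader.ReadInt32();
        Text = reader.ReadSafeString();
        Author = reader.ReadSafeString();
        if (version >= 2)               // older files simply stop earlier
            CreatedAt = new DateTime(reader.ReadInt64());
    }
}

public static class BinaryExtensions
{
    // BinaryWriter.Write(string) throws on null, so guard with a flag byte.
    public static void WriteSafeString(this BinaryWriter writer, string value)
    {
        writer.Write(value != null);
        if (value != null)
            writer.Write(value);
    }

    public static string ReadSafeString(this BinaryReader reader)
    {
        return reader.ReadBoolean() ? reader.ReadString() : null;
    }
}
```

Note how versioning works here: new fields are appended at the end of the record, and Load only reads them when the version stamp in the file is high enough, so old cache files remain readable.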
Check out the sample for implementation and see how it runs on your phone. The charts below show the speeds on a pre-production LG phone.