JSON Compression - Part Deux

For all those who missed the objective (probably my fault).

Posted by Malcolm Hollingsworth on

I received a LOT of feedback about this article; some good and some bad. Much of the feedback came via Twitter, which gave me the opportunity to respond.

Most of the people I engaged with understood the objective behind the idea. Many of those agreed with the comment in the article that this does not suit all needs.

Then I found out it had been posted on news.ycombinator.com, and this is where the assumption wildfire started. This was disappointing, as the commenters there had missed the main point and so made a huge assumption.

The 'assumption' problem

When people read something and either misunderstand the purpose, or that purpose was not pointed out clearly enough, they tend to make assumptions.

In this case I added to the confusion in a later follow-up, where I added figures related to GZIP compression after a Twitter suggestion. What I failed to point out is that whilst both are forms of compression, and they could be used together when transferring data from server to client, this was NOT the point of the original article.

The point of the article was JSON compression, NOT compression for transferring JSON between server and client.

Restating the benefit

The compression benefits are seen on the receiving end of the data transmission. The data object is now MUCH smaller, it requires MUCH less available memory on the device (or computer) whilst remaining just as easy to navigate as it would have been without the technique.

The Objectives

  1. Reduce the memory footprint of the data received.
  2. Never require the data to be 'decompressed' once received.
  3. Access the data without significant changes to the code already in place.
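
The original article explains the technique in full; as a rough sketch of the idea (the field names, the `keys`/`values` wire format and the `Rotated` wrapper below are my own illustration, not the article's exact format): rotating row-oriented JSON 90° stores each key string once instead of once per record, and a thin accessor keeps the data navigable without any decompression step.

```python
import json

# Row-oriented JSON: every record repeats the same key strings.
rows = [
    {"id": 1, "name": "Ann", "score": 91},
    {"id": 2, "name": "Bob", "score": 85},
    {"id": 3, "name": "Cat", "score": 78},
]

def rotate(records):
    """Rotate records 90 degrees: one shared list of keys,
    plus one list of values per record (objective 1)."""
    keys = list(records[0].keys())
    return {"keys": keys, "values": [[r[k] for k in keys] for r in records]}

class Rotated:
    """Thin accessor so existing code can keep reading records by index
    and key - no decompression pass needed (objectives 2 and 3)."""
    def __init__(self, data):
        self._index = {k: i for i, k in enumerate(data["keys"])}
        self._values = data["values"]

    def __len__(self):
        return len(self._values)

    def __getitem__(self, i):
        row = self._values[i]
        return {k: row[j] for k, j in self._index.items()}

compact = rotate(rows)
print(len(json.dumps(compact)) < len(json.dumps(rows)))  # smaller payload
table = Rotated(compact)
print(table[1]["name"])
```

Even on this tiny three-record example the rotated form serialises smaller, because the key strings appear once rather than once per record; the saving grows with the number of records.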

Why bother?

This had to be the strangest comment I found: "Why bother as gzip will handle it for you?". I hope this is based purely on the misunderstanding that the technique was created for transmission rather than client-side interaction.

The reason we should bother is to make our software much more efficient. There are many ways to achieve this; this technique is just one.


Whilst devices appear to gain more memory with each new round of announcements, some developers believe that increase solves the problem. That is a shame, as that belief is delusional at best and incompetent at worst.

You can still buy Android devices that have too little available memory to run the core system plus one or two normal-sized apps without generating issues.

Many people are still using old devices and want to use new apps, but cannot because too many developers simply do not consider the memory footprint of their software.

Operating systems have bugs; they leak memory and are slow to free up used memory - all of these things crash software. If your software requires less memory to run, there is a better chance that the poor choices made by other software will not affect yours.

Manufacturers' own layers added to Android are notorious for their problems. If your software is sitting on one of these, your problems are even worse.

Smaller, more efficient software responds faster to the person using it, which gives it a better chance of being used again - and ideally often - by that person.

But data compression doesn't solve everything!

If you agree with the statement above, then you understand the problem. Data compression is only one part of the solution.

You should aim for lean, fast, efficient software. Here are just a FEW of the areas that can be reviewed to improve your software:

  • Image assets
    • Compress these using all the tools available to you; PNGs and JPEGs can be reduced by a significant amount using specialist tools.
    • Only store images at the resolution specific to the device they are on.
    • Take advantage of app-thinning techniques, name and file your assets so that only those required for a device are included.
  • Other assets
    • Do you need all of them in the software all of the time or should some be on-demand?
    • On-demand could download from a server or unzip from compressed local versions as needed.
  • Housekeeping
    • Release allocated memory that is no longer in use; there are lots of techniques for this.
    • Remove cached files as soon as possible.
    • Learn memory management techniques for each platform and utilise them.
  • Perception
    • Let people know something is happening at all times.
    • People can think your software has crashed while it downloads a big file or loads a new window if you do not tell them that is happening.
    • Traditional activity indicators for in-line activity.
    • Animation, motion, UI tweaks to show transitions between one state and another.

That list is tiny, but doing just a couple of these can really improve the performance of your software.
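
As one illustration of the "other assets" point above, here is a minimal sketch (in Python for brevity; the `OnDemandAssets` name and interface are my own invention) of keeping bundled assets zipped and only inflating the bytes actually requested, then evicting them as part of housekeeping:

```python
import io
import zipfile

class OnDemandAssets:
    """Keep bundled assets compressed and inflate individual entries
    on demand, instead of holding everything in memory at once."""
    def __init__(self, archive):
        self._archive = archive  # path or file-like object
        self._cache = {}

    def get(self, name):
        # Inflate only the requested entry, and only on first use.
        if name not in self._cache:
            with zipfile.ZipFile(self._archive) as zf:
                self._cache[name] = zf.read(name)
        return self._cache[name]

    def evict(self, name=None):
        """Housekeeping: drop cached bytes as soon as they are done with."""
        if name is None:
            self._cache.clear()
        else:
            self._cache.pop(name, None)

# Demo: build a small archive in memory, then load one asset on demand.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("strings/en.json", '{"title": "Hello"}')
    zf.writestr("strings/fr.json", '{"title": "Bonjour"}')

assets = OnDemandAssets(buf)
print(assets.get("strings/en.json"))  # only this entry is inflated
assets.evict("strings/en.json")       # free it again when done
```

The same pattern works whether the "archive" is a local zip shipped with the app or a download triggered on first use; the point is that memory is paid per asset, per need, not up front.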


I started developing software on a ZX81. A friend lent his to me for a week.

It had 1K of memory, much of which was used for the screen; there were no graphics, colours were limited to just black and white, and there was a maximum of 26 variables.

I still managed to write a car racing game that had a day/night mode and also crash detection. David Horne even wrote a chess game for the computer.

Now we have seemingly so much available 'everything' that we take it for granted. The problem is so bad that developers are rarely taught any of the simple points I mentioned above or would even consider them useful.


If you are not thinking "how can I make this better, faster, more efficient, easier, more useful or more engaging?" - then you have a lot to learn.

My original article JSON Compression by Rotating Data 90° is just one thing to consider.