On a recent project I was asked to build a web-based interface for a student learning platform.
The client specified that the app was intended for use in regional Australia, where internet access is mostly slow or non-existent, so frequent API calls were not something to rely upon.
So I started work on the app, having it preload the required content and store it in localStorage. Part of the preloaded content was a dictionary of translations which ended up being very large.
When development was nearing completion, we started cross-browser testing. Chrome and Firefox both performed perfectly, but Safari couldn't get past the preloading stage.
As it turns out, Safari has a 5 MB limit on localStorage, and the data being pulled down was easily going over that.
classrooms = 1577.72 KB
dictionary = 4946.14 KB
story = 59.66 KB
Total = 6583.65 KB
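If you want to produce a size report like the one above yourself, something like the following sketch works. It assumes browsers store localStorage strings as UTF-16 (roughly 2 bytes per character), and in a browser you'd call it as storageReport({ ...localStorage }):

```javascript
// Rough per-key size report for a localStorage-like object of
// string keys and string values. Sizes are in KB, assuming each
// character costs ~2 bytes (UTF-16).
function storageReport(store) {
    const sizes = {};
    let total = 0;
    for (const [key, value] of Object.entries(store)) {
        const kb = ((key.length + value.length) * 2) / 1024;
        sizes[key] = kb;
        total += kb;
    }
    return { sizes, total };
}
```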
Since I was storing the data as JSON, with lots of duplicate strings and plain text, compression seemed like the way to go.
As it turns out, compressing the data was easy enough and reduced the data being stored by a massive 85%.
classrooms = 617.89 KB
dictionary = 373.52 KB
story = 14.10 KB
Total = 1005.64 KB
This also has the added bonus of allowing more customer-created data to come through the API without hitting that 5 MB limit right away.
Here's a quick rundown of what I did.
Add dependency
yarn add lz-string
Import
window.LZString = require('lz-string');
Setter
window.setStorage = function(key, value) {
    localStorage.setItem(
        key,
        LZString.compress(
            JSON.stringify(value)
        )
    );
};
Getter
window.getStorage = function(key) {
    // Guard against missing keys: getItem returns null, and passing
    // that through decompress/JSON.parse would throw.
    var raw = localStorage.getItem(key);
    if (raw === null) {
        return null;
    }
    return JSON.parse(
        LZString.decompress(raw)
    );
};
And you're done. Just reference getStorage/setStorage instead of localStorage.getItem/localStorage.setItem and it's compressed data all day every day.
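To see the wrappers in action end to end, here's a minimal sketch of the same pattern. So it can run outside the browser, it substitutes a plain in-memory object for localStorage and an identity compressor for lz-string; in the real app you'd use the browser's localStorage and require('lz-string') instead:

```javascript
// Stand-in for the browser's localStorage (assumption: browser API
// behaviour of getItem returning null for missing keys).
const store = new Map();
const localStorage = {
    setItem: (key, value) => store.set(key, String(value)),
    getItem: (key) => (store.has(key) ? store.get(key) : null),
};

// Stand-in for lz-string; swap in require('lz-string') in the real app.
const LZString = {
    compress: (s) => s,
    decompress: (s) => s,
};

function setStorage(key, value) {
    localStorage.setItem(key, LZString.compress(JSON.stringify(value)));
}

function getStorage(key) {
    const raw = localStorage.getItem(key);
    return raw === null ? null : JSON.parse(LZString.decompress(raw));
}

// Round trip: objects go in, identical objects come back out,
// and a missing key yields null rather than an exception.
setStorage('story', { title: 'Example Story', words: 42 });
const story = getStorage('story');
const missing = getStorage('no-such-key');
```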
WebSQL was also looked at alongside localStorage, but the W3C ceased work on that specification in November 2010.