
Reading large files leads to application crash


I’m making a new app that uses a map as its core, and the user can import geographic data from KML and KMZ files. Most of the time the app is intended to be used without an internet connection. The user can import zip files (KMZ) that contain geographic data to be used within the map; some of these zips unpack into a single text file (KML) of 300 MB or more, and the data inside these files is then read and loaded into the map.

In the beginning I used the Capacitor Filesystem plugin to read this unzipped text file and write a new, modified file into a specific folder on the device’s filesystem, but I started to get out-of-memory errors while doing this. After some research, I managed to solve the writing problem using a chunk-writing method, which uses far less memory (a simplified sketch of what I mean is below).
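
Roughly, the chunked write looks like this (a simplified sketch, not my exact code; the path, directory, and how the chunks are produced are placeholders): create the output file once, then append each piece so the whole 300 MB never sits in memory at once.

import { Filesystem, Directory, Encoding } from '@capacitor/filesystem';

// Simplified sketch of the chunk-writing approach: create an empty file,
// then append each text chunk, so only one chunk is held in memory at a time.
async function writeInChunks(path: string, chunks: AsyncIterable<string>) {
    // Create (or truncate) the output file
    await Filesystem.writeFile({
        path,
        data: '',
        directory: Directory.Data,
        encoding: Encoding.UTF8,
    });

    // Append one chunk at a time
    for await (const chunk of chunks) {
        await Filesystem.appendFile({
            path,
            data: chunk,
            directory: Directory.Data,
            encoding: Encoding.UTF8,
        });
    }
}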

My problem now is the reading part, which also crashes with an out-of-memory error if I try to use the conventional Capacitor read method.

My main problems now are:
1 – Most of the alternatives work with different types of data, like Blob or File objects, which are not easy to obtain with the Capacitor Filesystem.
2 – Because of this I am forced to do conversions that involve reading the file as a whole, which is always going to lead to an out-of-memory crash (see the sketch after this list).
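
To make problem 2 concrete, this is the kind of conversion I mean (illustrative names only): Filesystem.readFile hands back the entire file as a single base64 string, so just obtaining a Blob already forces the whole 300 MB+ into memory.

import { Filesystem, Directory } from '@capacitor/filesystem';

// Illustration of the problem: getting a Blob out of the Capacitor Filesystem
// means reading and decoding the entire file in memory first.
async function fileToBlob(path: string): Promise<Blob> {
    // readFile returns the whole file as one base64 string -> OOM on 300 MB files
    const result = await Filesystem.readFile({ path, directory: Directory.Data });

    const byteString = atob(result.data as string);
    const bytes = new Uint8Array(byteString.length);
    for (let i = 0; i < byteString.length; i++) {
        bytes[i] = byteString.charCodeAt(i);
    }
    return new Blob([bytes]);
}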

This is what I’m trying to do now, based on this Gist:

readBigFile(file: File, options) {
    const EventEmitter = require("events").EventEmitter;

    // file    - The file (Blob) to read from
    // options - Possible options:
    //   type      - (Default: "ArrayBuffer") Can be "Text" or "ArrayBuffer"
    //               ("DataURL" is unsupported at the moment - dunno how to concatenate dataUrls)
    //   chunkSize - (Default: 2 MB) The number of bytes to read per chunk
    // Returns an EventEmitter that emits the following events:
    //   data(data)   - Emits one chunk of data
    //   error(error) - Emits an error; reading stops (no end event will happen)
    //   end()        - Indicates that the file is done and there's no more data to read
    // Derived from http://stackoverflow.com/questions/14438187/javascript-filereader-parsing-long-file-in-chunks

    const emitter = new EventEmitter();

    if (options === undefined) options = {};
    if (options.type === undefined) options.type = "ArrayBuffer";
    if (options.chunkSize === undefined) options.chunkSize = 1024 * 1024 * 2;

    let offset = 0;
    const method = "readAs" + options.type; // "readAsText" or "readAsArrayBuffer"

    const onLoadHandler = (evt) => {
        if (evt.target.error !== null) {
            emitter.emit("error", evt.target.error);
            return;
        }

        const data = evt.target.result;

        offset += options.chunkSize;
        emitter.emit("data", data);

        if (offset >= file.size) {
            emitter.emit("end");
        } else {
            readChunk(offset, options.chunkSize, file);
        }
    };

    const readChunk = (_offset, length, _file) => {
        const reader = new FileReader();
        const blob = _file.slice(_offset, _offset + length);
        reader.onload = onLoadHandler;
        reader.onerror = (err) => emitter.emit("error", err); // catch read failures as well
        (reader as any)[method](blob);
    };

    readChunk(offset, options.chunkSize, file);

    return emitter;
}

The problem with this approach is that I need the file as a Blob to read it in chunks, but to convert the file to a Blob I need to read it first, which brings me back to the out-of-memory error.
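
One idea I’m considering, but have not verified for files this large, is to skip the Blob entirely: resolve the native URI, convert it with Capacitor.convertFileSrc, and fetch it, so the response body can be consumed as a stream of chunks instead of one object.

import { Capacitor } from '@capacitor/core';
import { Filesystem, Directory } from '@capacitor/filesystem';

// Untested idea: serve the file to the WebView via convertFileSrc and read the
// HTTP response body as a stream, one Uint8Array chunk at a time.
async function streamFile(path: string, onChunk: (chunk: Uint8Array) => void) {
    const { uri } = await Filesystem.getUri({ path, directory: Directory.Data });
    const response = await fetch(Capacitor.convertFileSrc(uri));

    const reader = response.body!.getReader();
    while (true) {
        const { done, value } = await reader.read();
        if (done) {
            break;
        }
        onChunk(value); // process one chunk instead of the whole file
    }
}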

I have tried some different approaches too:
1 - A package called read-chunk, but it wasn’t exactly what I needed. The package reads part of a file given a start position and a length, so I tried to use it to read the file piece by piece (a rough sketch of that attempt is below), but my write algorithm needs a whole object, and that object has to be a Blob or File, in order to write chunk by chunk. That creates one more problem: read-chunk returns a Buffer that I need to convert before writing, and I can’t just convert a single piece and send it to the write algorithm, because in the end I would have many separate files instead of one whole file.
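
This is roughly what that piece-by-piece attempt looked like (readPiece stands in for the read-chunk call, whose exact signature I don’t have at hand); the conversion step is where it falls apart, because I end up with many converted pieces instead of the single Blob/File my writer expects.

// Sketch of the read-chunk attempt: readPiece stands in for the read-chunk
// call, assumed to return one Buffer/Uint8Array per start position and length.
declare function readPiece(path: string, start: number, length: number): Promise<Uint8Array>;

async function readPieceByPiece(path: string, size: number, chunkSize = 2 * 1024 * 1024) {
    const pieces: Blob[] = [];

    for (let offset = 0; offset < size; offset += chunkSize) {
        const buffer = await readPiece(path, offset, Math.min(chunkSize, size - offset));
        // Each Buffer has to be converted before the write step...
        pieces.push(new Blob([buffer]));
    }

    // ...but this leaves many separate Blobs, while the chunked writer
    // expects one whole Blob/File object that it can slice on its own.
    return pieces;
}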
