
Allocation failed - JavaScript heap out of memory #815

@Lusitaniae

Description

Expected behavior

Running the export should complete successfully without out-of-memory errors.

This is being run from a 4 GB CI/CD instance; I'm not sure how much data we're keeping in the Firebase collections, as GCP doesn't show much.

Actual behavior

Starting Export 🏋️
Retrieving documents from collectionA
Retrieving documents from collectionB
Retrieving documents from collectionC
<--- Last few GCs --->
[59:0x55e45951f140]   320807 ms: Mark-sweep 1925.1 (1958.9) -> 1918.9 (1961.5) MB, 3710.1 / 0.0 ms  (average mu = 0.109, current mu = 0.024) allocation failure scavenge might not succeed
[59:0x55e45951f140]   324495 ms: Mark-sweep 1927.2 (1977.5) -> 1920.9 (1978.7) MB, 3601.3 / 0.0 ms  (average mu = 0.068, current mu = 0.024) allocation failure scavenge might not succeed
<--- JS stacktrace --->
FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
/bin/bash: line 151:    59 Aborted                 (core dumped) firestore-export --accountCredentials $GOOGLE_APPLICATION_CREDENTIALS --backupFile export.json --prettyPrint

Steps to reproduce the behavior

Try to export one or more large collections.

Workaround

Increasing the Node memory limit can work around the issue, but ideally we wouldn't need to hold all the data in RAM:

export NODE_OPTIONS=--max_old_space_size=4096
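In a CI job the limit can also be set inline for just the export step, so it doesn't leak into other commands. A sketch, reusing the flags from the failing command above; the 3072 value is an assumption chosen to leave headroom for the OS on a 4 GB instance:

```shell
# Raise the V8 old-space limit for this invocation only (value in MB).
# 3072 is a guess that leaves ~1 GB of headroom on a 4 GB machine.
NODE_OPTIONS=--max_old_space_size=3072 firestore-export \
  --accountCredentials "$GOOGLE_APPLICATION_CREDENTIALS" \
  --backupFile export.json \
  --prettyPrint
```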

More details

Running the export on my laptop (higher specs), I can see the resulting file is only 27 MB. It's surprising that it needs multiple GB of RAM to run in CI.
