Running out of memory while compiling with GHC

I recently upgraded my app to GHC 8.4 and am now attempting to deploy it to production. Unfortunately, when I try to compile it on the continuous integration server, the build runs out of memory every time. (This app has been compiling on this same service without problems for years.) Is there any way to configure GHC/Stack to make compilation use fewer resources? It's okay if it makes compilation slower.

For context, here is where the compiler crashes:

Progress 171/271: aeson-1.2.4.0                               aeson-1.2.4.0: copy/register
Progress 171/271: aeson-1.2.4.0                               Progress 172/271

--  While building custom Setup.hs for package Cabal-2.2.0.1 using:
      /root/.stack/setup-exe-cache/x86_64-linux/Cabal-simple_mPHDZzAJ_2.2.0.1_ghc-8.4.3 --builddir=.stack-work/dist/x86_64-linux/Cabal-2.2.0.1 build --ghc-options " -ddump-hi -ddump-to-file -fdiagnostics-color=always"
    Process exited with code: ExitFailure (-9) (THIS MAY INDICATE OUT OF MEMORY)

ETA: This question has been flagged as a duplicate of "Cabal install criterion out of memory", but unfortunately that solution doesn't help me. If I change stack build to stack build --ghc-options '+RTS -M1500M -RTS', the crash only seems to happen slightly earlier:

Cabal-2.2.0.1: copy/register
Progress 171/271: Cabal-2.2.0.1                               Progress 172/271

--  While building custom Setup.hs for package tzdata-0.1.20180501.0 using:
      /root/.stack/setup-exe-cache/x86_64-linux/Cabal-simple_mPHDZzAJ_2.2.0.1_ghc-8.4.3 --builddir=.stack-work/dist/x86_64-linux/Cabal-2.2.0.1 build --ghc-options " -ddump-hi -ddump-to-file -fdiagnostics-color=always"
    Process exited with code: ExitFailure (-9) (THIS MAY INDICATE OUT OF MEMORY)
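In case it matters, I'd also be open to slower-but-leaner settings. My understanding (unverified) is that Stack can apply the same RTS heap cap to every package it builds via a ghc-options entry in stack.yaml, and that stack build -j1 would serialize package builds to lower peak memory. A sketch of the stack.yaml change:

```shell
# Sketch (untested): add a project-wide GHC heap cap to stack.yaml.
# The "$everything" target should apply these options to all packages
# Stack builds, not just the immediate build targets.
cat >> stack.yaml <<'EOF'
ghc-options:
  "$everything": +RTS -M1500M -RTS
EOF
```

Though I'm not sure whether either setting would reach the custom Setup.hs builds that are actually failing above.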