Out of memory exception when decrypting a large file using Cipher

Kai

I am trying to implement an encrypt/decrypt program using the classes under javax.crypto and file streams for input/output. To limit memory usage, I run the JVM with the -Xmx256m parameter.

Encryption and decryption both work fine with smaller files, but when decrypting a huge file (1 GB in size), I get an out of memory error:

java.lang.OutOfMemoryError: Java heap space
    at java.util.Arrays.copyOf(Arrays.java:3236)
    at java.io.ByteArrayOutputStream.grow(ByteArrayOutputStream.java:118)
    at java.io.ByteArrayOutputStream.ensureCapacity(ByteArrayOutputStream.java:93)
    at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:153)
    at com.sun.crypto.provider.GaloisCounterMode.decrypt(GaloisCounterMode.java:505)
    at com.sun.crypto.provider.CipherCore.update(CipherCore.java:782)
    at com.sun.crypto.provider.CipherCore.update(CipherCore.java:667)
    at com.sun.crypto.provider.AESCipher.engineUpdate(AESCipher.java:380)
    at javax.crypto.Cipher.update(Cipher.java:1831)
    at javax.crypto.CipherOutputStream.write(CipherOutputStream.java:166)

Here is the decrypt code:

private final int _readSize = 0x10000;//64k

...

GCMParameterSpec gcmParameterSpec = new GCMParameterSpec(gcmTagSize, iv);
Key keySpec = new SecretKeySpec(key, keyParts[0]);
Cipher decCipher = Cipher.getInstance("AES/GCM/PKCS5Padding");

decCipher.init(Cipher.DECRYPT_MODE, keySpec, gcmParameterSpec);

try (InputStream fileInStream = Files.newInputStream(inputEncryptedFile);
    OutputStream fileOutStream = Files.newOutputStream(outputDecryptedFile)) {
    try (CipherOutputStream cipherOutputStream = new CipherOutputStream(fileOutStream, decCipher)) {
        long count = 0L;
        byte[] buffer = new byte[_readSize];

        int n;
        for (; (n = fileInStream.read(buffer)) != -1; count += (long) n) {
            cipherOutputStream.write(buffer, 0, n);
        }
    }
}

The key parameters such as gcmTagSize and iv are read from a key file, and the code works fine with smaller files, e.g. around 50 MB.

As I understand it, only 64 KB of data is passed to the cipher at a time, so why does it run out of heap memory? How can I avoid this?

Edit:

I have also tried a 4 KB buffer size; it failed with the same exception.

Edit 2:

With more testing, the maximum file size it can handle is around 1/4 of the heap size. For example, with -Xmx256m, files bigger than 64 MB fail to decrypt.

Michael Fehr

The bad news: in my opinion, the error is caused by the way AES in GCM mode is implemented in the JDK. The JDK's GCM implementation buffers the complete ciphertext internally (in the ByteArrayOutputStream you can see in your stack trace) until doFinal(), so that the authentication tag can be verified before any plaintext is released; the required heap therefore grows with the file size, not with your read buffer size. Even if you could get it to work, you would find that decrypting a large file (1 GB or so) takes a very long time. But there is good news: you can use BouncyCastle as the security provider for your decryption task; that way the decryption works with constant memory and is much faster.
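You can observe this buffering directly with a minimal, self-contained sketch (class name is mine) using the default JDK provider: even when the entire ciphertext, including the tag, is fed through update(), no plaintext comes back until doFinal():

```java
import javax.crypto.Cipher;
import javax.crypto.spec.GCMParameterSpec;
import javax.crypto.spec.SecretKeySpec;
import java.security.SecureRandom;

public class JdkGcmBufferingDemo {
    public static void main(String[] args) throws Exception {
        SecureRandom r = new SecureRandom();
        byte[] key = new byte[16];
        r.nextBytes(key);
        byte[] nonce = new byte[12];
        r.nextBytes(nonce);
        SecretKeySpec keySpec = new SecretKeySpec(key, "AES");
        GCMParameterSpec gcmSpec = new GCMParameterSpec(128, nonce);

        // encrypt 1 KB of zero bytes with the JDK's AES/GCM
        Cipher enc = Cipher.getInstance("AES/GCM/NoPadding");
        enc.init(Cipher.ENCRYPT_MODE, keySpec, gcmSpec);
        byte[] ciphertext = enc.doFinal(new byte[1024]);

        // feed ALL ciphertext (including the 16-byte tag) through update()
        Cipher dec = Cipher.getInstance("AES/GCM/NoPadding");
        dec.init(Cipher.DECRYPT_MODE, keySpec, gcmSpec);
        byte[] part = dec.update(ciphertext);
        int fromUpdate = (part == null) ? 0 : part.length;
        byte[] rest = dec.doFinal(); // tag is verified here, plaintext released

        System.out.println("plaintext from update(): " + fromUpdate);
        System.out.println("plaintext from doFinal(): " + rest.length);
    }
}
```

Because the decrypting cipher holds everything back until doFinal(), streaming through CipherOutputStream cannot bound the heap usage for GCM decryption, no matter how small the read buffer is.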

The following complete example creates a sample file of 1 GB, encrypts it with BouncyCastle, and later decrypts it. At the end, a file comparison shows that the plain and decrypted file contents are equal, and the files are deleted. You temporarily need more than 3 GB of free space on your device to run this example.

Using a buffer of 64 KB, I get these timings:

Milliseconds for Encryption: 14295 | Decryption: 16249

A buffer of 1 KB is a little slower on the encryption side, but much slower for decryption:

Milliseconds for Encryption: 15250 | Decryption: 21952

A final word regarding your cipher transformation: "AES/GCM/PKCS5Padding" does not actually exist. Some implementations accept the name, but the algorithm really used is "AES/GCM/NoPadding" (see "Can PKCS5Padding be in AES/GCM mode?" for more details).
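As a small sketch (class name is mine), request the real transformation explicitly; GCM is an AEAD mode that needs no block padding:

```java
import javax.crypto.Cipher;

public class GcmTransformationCheck {
    public static void main(String[] args) throws Exception {
        // GCM needs no block padding, so the correct name is AES/GCM/NoPadding
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        // getAlgorithm() returns the transformation passed to getInstance()
        System.out.println(cipher.getAlgorithm());
    }
}
```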

import org.bouncycastle.jce.provider.BouncyCastleProvider;
import javax.crypto.BadPaddingException;
import javax.crypto.Cipher;
import javax.crypto.IllegalBlockSizeException;
import javax.crypto.NoSuchPaddingException;
import javax.crypto.spec.GCMParameterSpec;
import javax.crypto.spec.SecretKeySpec;
import java.io.*;
import java.nio.file.Files;
import java.security.*;
import java.util.Arrays;

public class GcmTestBouncyCastle {
    public static void main(String[] args) throws IOException, NoSuchPaddingException, InvalidAlgorithmParameterException,
            NoSuchAlgorithmException, IllegalBlockSizeException, BadPaddingException, NoSuchProviderException, InvalidKeyException {
        System.out.println("Encryption & Decryption with BouncyCastle AES-GCM-Mode");
        System.out.println("https://stackoverflow.com/questions/61792534/out-of-memory-exception-when-decrypt-large-file-using-cipher");
        // you need bouncy castle, get version 1.65 here:
        // https://mvnrepository.com/artifact/org.bouncycastle/bcprov-jdk15on/1.65
        Security.addProvider(new BouncyCastleProvider());
        // setup files
        // filenames
        String filenamePlain = "plain.dat";
        String filenameEncrypt = "encrypt.dat";
        String filenameDecrypt = "decrypt.dat";
        // generate a test file; use the commented-out line for a quick 1024-byte run
        //createFileWithDefinedLength(filenamePlain, 1024);
        createFileWithDefinedLength(filenamePlain, 1024L * 1024 * 1024); // 1 GB
        // time measurement
        long startMilli = 0;
        long encryptionMilli = 0;
        long decryptionMilli = 0;
        // generate nonce/iv
        int GCM_NONCE_LENGTH = 12; // for a nonce of 96 bit length
        int GCM_TAG_LENGTH = 16;
        int GCM_KEY_LENGTH = 32; // 32 = 256 bit keylength, 16 = 128 bit keylength
        SecureRandom r = new SecureRandom();
        byte[] nonce = new byte[GCM_NONCE_LENGTH];
        r.nextBytes(nonce);
        // key should be generated as random byte[]
        byte[] key = new byte[GCM_KEY_LENGTH];
        r.nextBytes(key);
        // encrypt file
        startMilli = System.currentTimeMillis();
        encryptWithGcmBc(filenamePlain, filenameEncrypt, key, nonce, GCM_TAG_LENGTH);
        encryptionMilli = System.currentTimeMillis() - startMilli;
        startMilli = System.currentTimeMillis();
        decryptWithGcmBc(filenameEncrypt, filenameDecrypt, key, nonce, GCM_TAG_LENGTH);
        decryptionMilli = System.currentTimeMillis() - startMilli;
        // check that plain and decrypted files are equal
        System.out.println("SHA256-file compare " + filenamePlain + " | " + filenameDecrypt + " : "
                + Arrays.equals(sha256File(filenamePlain), sha256File(filenameDecrypt)));
        System.out.println("Milliseconds for Encryption: " + encryptionMilli + " | Decryption: " + decryptionMilli);
        // clean up with files
        Files.deleteIfExists(new File(filenamePlain).toPath());
        Files.deleteIfExists(new File(filenameEncrypt).toPath());
        Files.deleteIfExists(new File(filenameDecrypt).toPath());
    }

    public static void encryptWithGcmBc(String filenamePlain, String filenameEnc, byte[] key, byte[] nonce, int gcm_tag_length)
            throws IOException, NoSuchAlgorithmException, NoSuchPaddingException, InvalidKeyException,
            InvalidAlgorithmParameterException, IllegalBlockSizeException, BadPaddingException, NoSuchProviderException {
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding", "BC");
        SecretKeySpec keySpec = new SecretKeySpec(key, "AES");
        GCMParameterSpec gcmSpec = new GCMParameterSpec(gcm_tag_length * 8, nonce);
        cipher.init(Cipher.ENCRYPT_MODE, keySpec, gcmSpec);

        try (FileInputStream fis = new FileInputStream(filenamePlain);
             BufferedInputStream in = new BufferedInputStream(fis);
             FileOutputStream out = new FileOutputStream(filenameEnc);
             BufferedOutputStream bos = new BufferedOutputStream(out)) {
            //byte[] ibuf = new byte[1024];
            byte[] ibuf = new byte[0x10000]; // = 65536
            int len;
            while ((len = in.read(ibuf)) != -1) {
                byte[] obuf = cipher.update(ibuf, 0, len);
                if (obuf != null)
                    bos.write(obuf);
            }
            byte[] obuf = cipher.doFinal();
            if (obuf != null)
                bos.write(obuf);
        }
    }

    public static void decryptWithGcmBc(String filenameEnc, String filenameDec, byte[] key, byte[] nonce, int gcm_tag_length)
            throws IOException, NoSuchAlgorithmException, NoSuchPaddingException, InvalidKeyException,
            InvalidAlgorithmParameterException, IllegalBlockSizeException, BadPaddingException, NoSuchProviderException {
        try (FileInputStream in = new FileInputStream(filenameEnc);
             FileOutputStream out = new FileOutputStream(filenameDec)) {
            //byte[] ibuf = new byte[1024];
            byte[] ibuf = new byte[0x10000]; // = 65536
            int len;
            Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding", "BC");
            SecretKeySpec keySpec = new SecretKeySpec(key, "AES");
            GCMParameterSpec gcmSpec = new GCMParameterSpec(gcm_tag_length * 8, nonce);
            cipher.init(Cipher.DECRYPT_MODE, keySpec, gcmSpec);
            while ((len = in.read(ibuf)) != -1) {
                byte[] obuf = cipher.update(ibuf, 0, len);
                if (obuf != null)
                    out.write(obuf);
            }
            byte[] obuf = cipher.doFinal();
            if (obuf != null)
                out.write(obuf);
        }
    }

    // just for creating a large file within seconds
    private static void createFileWithDefinedLength(String filenameString, long sizeLong) throws IOException {
        try (RandomAccessFile raf = new RandomAccessFile(filenameString, "rw")) {
            raf.setLength(sizeLong);
        }
    }

    // just for file comparing
    public static byte[] sha256File(String filenameString) throws IOException, NoSuchAlgorithmException {
        byte[] buffer = new byte[8192];
        int count;
        MessageDigest md = MessageDigest.getInstance("SHA-256");
        try (BufferedInputStream bis = new BufferedInputStream(new FileInputStream(filenameString))) {
            while ((count = bis.read(buffer)) > 0) {
                md.update(buffer, 0, count);
            }
        }
        return md.digest();
    }
}
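One caveat with this streaming approach: BouncyCastle releases decrypted bytes from update() before the authentication tag is checked, so the output file must be treated as untrusted until doFinal() completes without error; on tampered input, doFinal() throws an AEADBadTagException and the partial output should be discarded. A minimal in-memory sketch of that failure mode (class name is mine; default provider and a small buffer used for brevity):

```java
import javax.crypto.AEADBadTagException;
import javax.crypto.Cipher;
import javax.crypto.spec.GCMParameterSpec;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;

public class GcmTamperDemo {
    public static void main(String[] args) throws Exception {
        SecureRandom r = new SecureRandom();
        byte[] key = new byte[16];
        r.nextBytes(key);
        byte[] nonce = new byte[12];
        r.nextBytes(nonce);
        SecretKeySpec keySpec = new SecretKeySpec(key, "AES");
        GCMParameterSpec gcmSpec = new GCMParameterSpec(128, nonce);

        Cipher enc = Cipher.getInstance("AES/GCM/NoPadding");
        enc.init(Cipher.ENCRYPT_MODE, keySpec, gcmSpec);
        byte[] ciphertext = enc.doFinal("hello".getBytes(StandardCharsets.UTF_8));

        ciphertext[0] ^= 1; // flip one bit to simulate corruption/tampering

        Cipher dec = Cipher.getInstance("AES/GCM/NoPadding");
        dec.init(Cipher.DECRYPT_MODE, keySpec, gcmSpec);
        try {
            dec.doFinal(ciphertext);
            System.out.println("unexpected: tag accepted");
        } catch (AEADBadTagException e) {
            // here a real application would delete the partial output file
            System.out.println("AEADBadTagException caught");
        }
    }
}
```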
