Export of large amounts of data with JMSSerializerBundle

smartcoderx

If I try to export a large amount of data with JMSSerializerBundle, I get the following error:

FatalErrorException: Error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 1332351 bytes) in /var/www/app/trunk/vendor/symfony/symfony/src/Symfony/Component/HttpKernel/DataCollector/DataCollector.php line 27

If I export only a few records with this bundle, everything works fine.

$format = 'json';
$serializer = \JMS\Serializer\SerializerBuilder::create()->build();
$serializer->serialize(
    $data,
    $format,
    \JMS\Serializer\SerializationContext::create()->enableMaxDepthChecks()
);

The array $data holds 1,917 records.

How can I handle this issue?

Martin Fasani

Try adding this somewhere in your script:

echo ini_get('memory_limit'); // shows the current PHP memory limit

Then raise the memory_limit setting in your php.ini, restart the server, and try again.
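For reference, a minimal sketch of checking and raising the limit at runtime; the "512M" value is just an example, and a permanent change still belongs in php.ini:

<?php
// Inspect the current limit, then raise it for this script only.
echo ini_get('memory_limit'); // e.g. "128M"
ini_set('memory_limit', '512M');

If raising the limit is not an option (or the export keeps growing), you can keep peak memory low by serializing the records in batches, so the full JSON string never sits in memory at once. This is a workaround sketch, not a built-in JMS streaming feature; the batch size of 200 and the php://output target are assumptions:

<?php
use JMS\Serializer\SerializerBuilder;
use JMS\Serializer\SerializationContext;

$serializer = SerializerBuilder::create()->build();

$out = fopen('php://output', 'w');
fwrite($out, '[');
$chunks = array_chunk($data, 200); // example batch size
foreach ($chunks as $i => $chunk) {
    // A SerializationContext cannot be reused, so create one per batch.
    $json = $serializer->serialize(
        $chunk,
        'json',
        SerializationContext::create()->enableMaxDepthChecks()
    );
    // Strip each batch's surrounding brackets so the pieces
    // concatenate into a single JSON array.
    fwrite($out, substr($json, 1, -1));
    if ($i < count($chunks) - 1) {
        fwrite($out, ',');
    }
}
fwrite($out, ']');
fclose($out);

Freeing each batch's source objects between iterations (unset() or similar) helps further.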



