I experienced no problems using dumpdata to export a 160,000-record fixture, in both JSON and YAML formats. It appears the latest patch has already been applied to trunk. Is there anything else where I could go wrong? What's the format with the least memory use?

I installed the patch, but still have to watch Django eat up all the memory without any output when I try to dump my database (700,000 records) :( DEBUG is False.

I mean it fixes the bugs mentioned above, and there are no more bugs known to me :). It does stream and doesn't eat memory; django_dumpdata_streamed_output_3.diff works fine with 250k objects.

The error is: Error: Unable to serialize database: 'NoneType' object has no attribute 'strftime'.

Even with django_dumpdata_streamed_output_3.diff, I still go OOM trying to dump tables with 250k rows in them to JSON. I have applied the latest patch, django_dumpdata_streamed_output_2.diff from 10/20/07, on the latest version from svn.

I am adding a cleaned-up patch that does not have the date-formatting changes from my local repository.

Looks pretty good from a quick skim over; would someone else from this ticket review the patch and report back?

Python process resident memory size was at about 20M all the time. I used this patch and changed () to () in django/core/management/commands/dumpdata.py. After that I was successfully able to dump a considerably big database (1 GB SQLite file).

Whoops, please ignore the date-formatting changes from the patch. Note that YAML output has the same streaming problem, which is not fixed here. This should get rid of your memory problems (you are welcome to test this patch). It also uses a generator method for the object list instead of storing everything in a list beforehand.

Perhaps there's a bug in that memory isn't being freed as it iterates.
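The "generator method for the object list" idea mentioned above can be sketched in plain Python. This is not the actual patch code from the ticket; `fetch_rows` is a hypothetical stand-in for something like `QuerySet.iterator()`, and `stream_json` is a hypothetical helper that shows why streaming keeps peak memory at roughly one row instead of the whole fixture:

```python
import io
import json

# Hypothetical stand-in for QuerySet.iterator(): yields one row at a
# time instead of materializing the whole table as a list up front.
def fetch_rows(n):
    for pk in range(n):
        yield {"pk": pk, "model": "app.thing", "fields": {"name": "thing %d" % pk}}

def stream_json(rows, out):
    """Write a JSON array element by element, so only one serialized
    row is held in memory at a time (the streaming approach the
    patches on this ticket aim for)."""
    out.write("[")
    for i, row in enumerate(rows):
        if i:
            out.write(", ")
        json.dump(row, out)
    out.write("]")

buf = io.StringIO()
stream_json(fetch_rows(3), buf)
print(buf.getvalue())
```

The key point is that nothing in this pipeline ever holds the full object list: the generator yields rows lazily and each one is written out and discarded before the next is fetched.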
The dump consumed about 100M of my free memory while it was churning, then dumped everything to screen and the 100M became free again. Just as a point of reference, I dumped a table with 100k rows into YAML without problems.
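For anyone wanting to reproduce the memory observations above, one way to check peak resident memory from inside the process is the standard-library `resource` module (Unix only; note the units of `ru_maxrss` differ by platform):

```python
import resource

def peak_rss():
    # Peak resident set size of this process so far.
    # Units are kilobytes on Linux but bytes on macOS.
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

print("peak RSS:", peak_rss())
```

Calling this before and after a dump gives a rough picture of whether memory grows with the number of rows (the symptom reported above) or stays flat, as it should with a streaming serializer.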