Limiting factors for very large image imports?

I have to import very large images (~2GB) into DAM 5.5 in the following environment: OS X 10.6.8, PostgreSQL 9.1, Java 1.6.0_26 64-bit. The JVM has 3.5GB allocated with 512MB of PermGen. PostgreSQL shared buffers are set to 1GB and shmmax has been raised with sysctl.

When importing files of 2GB+ into the VCS, which Nuxeo/JVM/Postgres settings determine the maximum file size that can be imported?

I should add that I am running the import from the command line, in the form…

curl -uUser:Password "http://localhost:8080/nuxeo/site/damImporter/run?inputPath=/path/to/import&importFolderTitle=stuff"

Cheers, Bruce.

0 votes

1 answer



Importing large files into Nuxeo does not depend on any of the settings you mention (Nuxeo / JVM / Postgres).

When you upload a file into Nuxeo, it is stored directly on the server's file system and does not depend on RAM.

So increasing the RAM settings for Nuxeo / JVM / Postgres has no influence; you just need to make sure you have enough space on your file system!
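A quick way to act on this is to check free space on the volume backing Nuxeo's binary store before launching the import. This is a minimal sketch: the path and the ~4GB headroom figure are assumptions for illustration, not values from this thread — point `CHECK_PATH` at your actual Nuxeo data directory.

```shell
# Hedged sketch: verify the binary store's volume has headroom before
# importing a 2GB file. CHECK_PATH and REQUIRED_KB are assumptions.
CHECK_PATH=/tmp
REQUIRED_KB=$((4 * 1024 * 1024))   # ~4GB: original plus working copies
# df -Pk gives POSIX-format output in 1K blocks; column 4 is "Available".
AVAIL_KB=$(df -Pk "$CHECK_PATH" | awk 'NR==2 {print $4}')
if [ "$AVAIL_KB" -ge "$REQUIRED_KB" ]; then
  echo "enough space for the import"
else
  echo "only ${AVAIL_KB}KB free, import may fail"
fi
```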

1 votes

Yes, if something loads the whole file into memory then it's a bug. We've cleaned up a number of things in this area recently, see NXP-8642.

Thanks Florent - I hadn't flagged this as the answer yet because I think there may still be an issue with large images and ImageMagick behaving badly with files > 2GB. After my latest tests I'm not sure it's memory-related; however, I haven't entirely ruled that out either.
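For what it's worth, ImageMagick's resource limits can be lowered so that huge images are processed through a disk-backed pixel cache rather than held in RAM. This is a hedged sketch using ImageMagick's standard `MAGICK_*` environment variables; the values are illustrative and not something this thread confirms as a fix for the DAM behaviour:

```shell
# ImageMagick resource limits via its standard environment variables.
# Values are illustrative assumptions, not tested against DAM 5.5.
export MAGICK_MEMORY_LIMIT=1GiB   # pixel cache kept in RAM
export MAGICK_MAP_LIMIT=2GiB      # memory-mapped pixel cache
export MAGICK_DISK_LIMIT=16GiB    # spill to disk beyond the limits above
```

`identify -list resource` will show the limits ImageMagick actually picked up.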

Was the DAM code-base considered when looking for issues with 2GB+ files (as seen in NXP-8642)?


No I don't think NXP-8642 covered DAM code.