I am skeptical about the amount of memory required to solve my model.
I have a 3D model consisting mostly of SOLID186 elements (520,936) with some SHELL181 elements (10,580), for a total of 2,160,651 nodes. I also have two bonded contacts and one frictionless contact in the model, which add 45,366 contact elements. It is a Static Structural model.
When I run my model, I receive this error:
*** ERROR ***
There is not enough memory for the Distributed Sparse Matrix Solver to proceed using the out-of-core memory mode. The total memory required by all processes = 40993 MB. The total physical memory that is available on the system = 15435 MB. Please decrease the model size, or run this model on another system with more physical memory.
Does 40GB of memory seem reasonable for a model of this size?
Section '5.2 Types of Solvers' of the ANSYS manual says that out-of-core mode typically requires around 1 GB per million DOFs. With 3 DOFs per node, my model has ~6.5 million DOFs, so I should only need ~6.5 GB.
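For clarity, here is the arithmetic behind my estimate as a quick sketch (it assumes a flat 3 DOFs per node, as in the rule of thumb above, and takes the 1 GB per million DOFs figure at face value):

```python
# Sanity check: out-of-core memory estimate vs. what the solver reported.
# Assumes 3 DOFs per node across the whole model (the rule of thumb I used).
nodes = 2_160_651
dofs_per_node = 3
total_dofs = nodes * dofs_per_node              # ~6.48 million DOFs

gb_per_million_dofs = 1.0                       # manual's out-of-core rule of thumb
expected_gb = total_dofs / 1e6 * gb_per_million_dofs
print(f"Expected out-of-core memory: ~{expected_gb:.1f} GB")

reported_gb = 40993 / 1024                      # from the error message (MB -> GB)
print(f"Solver-reported requirement: ~{reported_gb:.1f} GB")
print(f"Ratio (reported / expected): ~{reported_gb / expected_gb:.1f}x")
```

By this estimate the solver is asking for roughly six times what the rule of thumb predicts, which is why I suspect something may be off.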
Let me know if attaching my model would help.
Does the excessive memory requirement suggest that something is wrong with my model, or is this amount of required memory reasonable?
I only have two other warnings in my model, but I think that I can ignore them:
1. Element shape checking is currently inactive. Issue SHPP,ON or SHPP,WARN to reactivate, if desired.
2. Material number 29 (used by element 575553) should normally have at least one MP or one TB type command associated with it. Output of energy by material may not be available.