Another source of mesh dependence in topology optimization

The topology optimization community has regularly employed nonlinear programming (NLP) algorithms from the operations research community. However, these algorithms are implemented in the real vector space ℝ^n rather than the proper function space in which the design variable resides. In this article, we show how the discretization of the volume fraction design variable on non-uniform meshes affects the convergence of ℝ^n-based NLP algorithms. We do so by first summarizing the functional analysis tools needed to understand why the mesh affects convergence, namely the distinction between different derivative definitions and the role of the mesh-dependent inner product within the NLP algorithm. These tools are then used to make the Globally Convergent Method of Moving Asymptotes (GCMMA), a popular NLP algorithm in the topology optimization community, converge in a mesh-independent fashion when starting from the same initial design. We then benchmark our algorithms on three common problems in topology optimization.
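To make the role of the mesh-dependent inner product concrete, the following minimal Python sketch (not taken from the paper; the mesh, the functional, and the lumped L² inner product are illustrative assumptions) contrasts the Euclidean gradient that an ℝ^n NLP code works with against its Riesz representative in the mesh-weighted L² inner product. On a non-uniform mesh the Euclidean gradient inherits the local element sizes, while the L² gradient recovers the mesh-independent functional derivative.

```python
import numpy as np

# Hypothetical 1D design field on a non-uniform mesh: element "volumes" (cell sizes).
element_volumes = np.array([0.5, 0.25, 0.25, 0.125, 0.125])

# For the volume functional J(rho) = sum_i rho_i * V_i (a discrete integral of rho),
# the partial derivatives an R^n NLP code sees are dJ/drho_i = V_i: they vary with
# the local mesh size even though the underlying functional derivative is the
# constant function 1.
dJ_drho = element_volumes.copy()

# Riesz representative of the derivative in the (lumped) L^2 inner product:
# dividing by the element volume recovers the mesh-independent gradient field.
g_l2 = dJ_drho / element_volumes

def l2_inner(u, v, volumes):
    """Mesh-weighted (lumped L^2) inner product: (u, v) ~ sum_i u_i * v_i * V_i."""
    return float(np.sum(u * v * volumes))

print("Euclidean gradient :", dJ_drho)   # mesh dependent
print("L2 gradient        :", g_l2)      # constant, mesh independent
print("||g||_L2           :", np.sqrt(l2_inner(g_l2, g_l2, element_volumes)))
```

An ℝ^n algorithm that measures step lengths and descent directions with the Euclidean dot product therefore behaves differently on every refinement of a non-uniform mesh; replacing that dot product with the mesh-weighted inner product (as sketched above) is the kind of change the article applies to GCMMA.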
