Many solid modeling applications require information not only about the geometry of an object but also about its ``topology'' -- the connectivity of its faces, edges, and vertices. Many interchange formats do not provide this information, so the application must derive it itself, building a topological data structure from an unorganized list of triangles. For very large data sets, that data structure can be bigger than core memory, and a naive algorithm for building it can become prohibitively slow due to memory thrashing.
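To make the problem concrete, here is a minimal sketch of the naive in-core approach the abstract alludes to: deriving edge-to-face connectivity from a triangle soup with an ordinary hash table. All names (Tri, edgeKey, buildEdgeFaceMap) are illustrative, not the talk's actual data structure.

    // Naive in-core construction of edge->face adjacency from a triangle soup.
    #include <cstdint>
    #include <unordered_map>
    #include <utility>
    #include <vector>

    struct Tri { uint32_t v[3]; };   // indices into a shared vertex array

    // Canonical undirected edge key: smaller vertex index first.
    static uint64_t edgeKey(uint32_t a, uint32_t b) {
        if (a > b) std::swap(a, b);
        return (uint64_t(a) << 32) | b;
    }

    // Map each edge to the faces that share it (two, for a manifold mesh).
    std::unordered_map<uint64_t, std::vector<uint32_t>>
    buildEdgeFaceMap(const std::vector<Tri>& tris) {
        std::unordered_map<uint64_t, std::vector<uint32_t>> edgeToFaces;
        for (uint32_t f = 0; f < tris.size(); ++f)
            for (int i = 0; i < 3; ++i)
                edgeToFaces[edgeKey(tris[f].v[i], tris[f].v[(i + 1) % 3])]
                    .push_back(f);
        // Random-access lookups: once the table outgrows RAM, this thrashes.
        return edgeToFaces;
    }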
In this talk, I describe a new out-of-core algorithm for building a topological data structure from very large, unstructured data sets. While determining connectivity relationships, our algorithm re-orders reads and writes so that most of them are purely sequential, dividing the hash tables and sorts into pieces that fit in memory and partitioning the remaining data accordingly. This lets us write all of the information recorded in the final data structure sequentially, at creation time; we never need to go back and modify entities that have already been written out to disk. The algorithm also eliminates the need to stitch pieces of the geometry back together along boundaries that might not themselves fit in memory, as conventional geometric partitioning would require.
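The abstract does not give implementation details, but the partitioning idea might look something like the following hypothetical sketch: one sequential pass scatters edge records into hash buckets on disk, each small enough that its own hash table fits in memory; a second pass resolves each bucket and appends matched face pairs sequentially. Record layout, bucket count, and all names here are assumptions for illustration only.

    // Hypothetical two-pass out-of-core edge matching; assumes a manifold
    // mesh (at most two faces per edge). Not the talk's actual code.
    #include <cstdint>
    #include <cstdio>
    #include <unordered_map>

    struct EdgeRec { uint64_t key; uint32_t face; };  // one edge occurrence

    constexpr int kBuckets = 64;  // chosen so each bucket's table fits in RAM

    // Pass 1: sequential read of the edge stream, sequential append to buckets.
    void partitionPass(FILE* edgeStream, FILE* buckets[kBuckets]) {
        EdgeRec r;
        while (std::fread(&r, sizeof r, 1, edgeStream) == 1)
            std::fwrite(&r, sizeof r, 1, buckets[r.key % kBuckets]);
    }

    // Pass 2: each bucket's hash table fits in memory by construction;
    // matched face pairs are written once and never revisited.
    void resolveBucket(FILE* bucket, FILE* out) {
        std::unordered_map<uint64_t, uint32_t> firstFace;
        EdgeRec r;
        while (std::fread(&r, sizeof r, 1, bucket) == 1) {
            auto [it, fresh] = firstFace.try_emplace(r.key, r.face);
            if (!fresh) {  // second face sharing this edge: emit the pair
                uint32_t pair[2] = { it->second, r.face };
                std::fwrite(pair, sizeof pair, 1, out);
            }
        }
    }

Because each edge record lands in exactly one bucket, no cross-bucket stitching is ever needed -- mirroring, in miniature, the abstract's claim about avoiding the boundary-stitching that geometric partitioning requires.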
Snacks will be provided.
See Conundrum Talks for more information about this series.