We present a deep learning approach for speeding up constrained procedural modeling. Probabilistic inference algorithms such as Sequential Monte Carlo (SMC) provide powerful tools for constraining procedural models, but they require many samples to produce desirable results. In this paper, we show how to create procedural models which learn how to satisfy constraints. We augment procedural models with neural networks: these networks control how the model makes random choices based on what output it has generated thus far. We call such a model a neurally-guided procedural model. As a pre-computation, we train these models on constraint-satisfying example outputs generated via SMC. They are then used as efficient importance samplers for SMC, generating high-quality results with very few samples. We evaluate our method on L-system-like models with image-based constraints. Given a desired quality threshold, neurally-guided models can generate satisfactory results up to 10x faster than unguided models.
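To make the idea concrete, below is a minimal sketch (in PyTorch) of a neurally-guided procedural model: a small network proposes each random choice from features of the partial output, is trained on choices recorded from constraint-satisfying example traces, and at generation time accumulates importance weights against the unguided prior. The toy action space, feature dimension, and hand-crafted feature function are illustrative assumptions, not the paper's actual architecture (which conditions on rendered partial images).

```python
# Hypothetical sketch of a neurally-guided procedural model.
import torch
import torch.nn as nn
from torch.distributions import Categorical

N_ACTIONS = 3       # e.g. 0 = stop, 1 = branch left, 2 = branch right (assumption)
FEATURE_DIM = 8     # size of a summary of the partial output (assumption)

class GuideNet(nn.Module):
    """Maps features of the partial output to a distribution over the next choice."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(FEATURE_DIM, 32), nn.ReLU(),
            nn.Linear(32, N_ACTIONS),
        )

    def forward(self, features):
        return Categorical(logits=self.net(features))

def prior():
    """The unguided model's fixed distribution over the next choice."""
    return Categorical(logits=torch.zeros(N_ACTIONS))

def train_guide(guide, traces, epochs=50, lr=1e-3):
    """Maximize the likelihood of the choices recorded in constraint-satisfying
    example traces; each trace is a list of (features, choice) pairs
    produced offline by SMC."""
    opt = torch.optim.Adam(guide.parameters(), lr=lr)
    for _ in range(epochs):
        for trace in traces:
            loss = torch.zeros(())
            for features, choice in trace:
                loss = loss - guide(features).log_prob(torch.tensor(choice))
            opt.zero_grad()
            loss.backward()
            opt.step()

def guided_sample(guide, feature_fn, max_steps=20):
    """Generate one output, proposing each choice from the guide and
    accumulating the importance weight log p(choices) - log q(choices)."""
    choices, log_weight = [], torch.zeros(())
    for _ in range(max_steps):
        features = feature_fn(choices)          # summarize the partial output
        q = guide(features)
        choice = q.sample()
        log_weight = log_weight + prior().log_prob(choice) - q.log_prob(choice)
        choices.append(int(choice))
        if int(choice) == 0:                    # "stop" ends the derivation
            break
    return choices, log_weight
```

In an SMC setting, `guided_sample` would be called once per particle, with the returned log-weights (plus the constraint score) used for resampling; because the guide already concentrates proposals on constraint-satisfying regions, far fewer particles are needed than with the unguided prior.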
from cs.AI updates on arXiv.org http://ift.tt/25h2bkt