Modeling large-scale vegetation on terrain is an important yet challenging task in computer games, film production, and other digital entertainment applications. In this work, we propose a novel example-based method for the rapid generation of vegetation in outdoor natural environments. Its central idea is to learn the vegetation distribution on terrain via deep convolutional neural networks. First, we use a pre-trained deep neural network to extract rich local terrain features pertinent to vegetation distribution. Second, we produce initial features of the target vegetation distribution via patch matching and introduce a network that generates a vegetation density map from these initial features. Third, during the synthesis stage, we employ a procedural method to generate vegetation distribution data corresponding to the terrain data. Our work confirms that image features extracted by a pre-trained deep neural network can be utilized to explore the connection between vegetation and terrain. We validate the new method on various outdoor scenes, including procedurally generated scenes and scenes with manual control over tree patterns. The experimental results demonstrate that our method, by learning the correlation between vegetation distribution and terrain data, can rapidly produce realistic new scenes of outdoor natural environments.
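To make the three-stage pipeline concrete, the sketch below outlines it in PyTorch. The backbone network (VGG-16), the 1x1 feature matching in place of full patch matching, the refiner architecture, and the stochastic placement rule are all illustrative assumptions for this sketch, not the paper's actual components or parameters.

```python
# Minimal, illustrative sketch of the abstract's pipeline.
# Assumptions: VGG-16 backbone, per-cell feature matching, a tiny
# refiner CNN, and thresholded random placement; the paper's actual
# architecture, patch size, and synthesis procedure may differ.
import torch
import torch.nn.functional as F
from torchvision.models import vgg16

def terrain_features(heightmap):
    """Stage 1: extract per-location features from a terrain heightmap
    (H, W) with a pre-trained CNN (VGG-16 here, an assumption)."""
    net = vgg16(weights="DEFAULT").features[:16].eval()   # up to relu3_3
    x = heightmap.unsqueeze(0).unsqueeze(0).repeat(1, 3, 1, 1)  # fake RGB;
    with torch.no_grad():                                 # normalization omitted
        feats = net(x)                                    # (1, C, h, w)
    return F.interpolate(feats, size=tuple(heightmap.shape),
                         mode="bilinear", align_corners=False)[0]  # (C, H, W)

def initial_density(target_feats, exemplar_feats, exemplar_density):
    """Stage 2a: patch matching. For each target cell, copy the vegetation
    density of the best-matching exemplar cell (1x1 'patches' for brevity;
    real patch matching compares k x k neighborhoods)."""
    C, H, W = target_feats.shape
    t = target_feats.reshape(C, -1).T                     # (HW, C)
    e = exemplar_feats.reshape(C, -1).T                   # (hw, C)
    nn_idx = torch.cdist(t, e).argmin(dim=1)              # nearest exemplar cell
    return exemplar_density.reshape(-1)[nn_idx].reshape(H, W)

class DensityRefiner(torch.nn.Module):
    """Stage 2b: small CNN mapping terrain features + initial density to a
    refined vegetation density map (stand-in for the paper's generator)."""
    def __init__(self, in_ch):
        super().__init__()
        self.body = torch.nn.Sequential(
            torch.nn.Conv2d(in_ch + 1, 32, 3, padding=1), torch.nn.ReLU(),
            torch.nn.Conv2d(32, 32, 3, padding=1), torch.nn.ReLU(),
            torch.nn.Conv2d(32, 1, 3, padding=1), torch.nn.Sigmoid())

    def forward(self, feats, init_density):
        x = torch.cat([feats, init_density.unsqueeze(0)], dim=0)
        return self.body(x.unsqueeze(0))[0, 0]            # (H, W) in [0, 1]

def place_trees(density, cell=4.0):
    """Stage 3: procedural placement. Keep one candidate per grid cell with
    probability equal to the local density (a simple stand-in for the
    paper's synthesis stage)."""
    keep = torch.rand_like(density) < density
    ys, xs = torch.nonzero(keep, as_tuple=True)
    jitter = torch.rand(ys.numel(), 2)                    # sub-cell offsets
    return torch.stack([xs + jitter[:, 0], ys + jitter[:, 1]], dim=1) * cell
```

In this reading of the pipeline, the learned density map fully decouples distribution learning from placement: any procedural scatterer can consume it, which is what allows the synthesis stage to remain fast.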
Zhang, J., Wang, C., Li, C., & Qin, H. (2019). Example-based rapid generation of vegetation on terrain via CNN-based distribution learning. Visual Computer, 35(6–8), 1181–1191. https://doi.org/10.1007/s00371-019-01667-w