Table of Contents
Related Papers/Work
1. Neural-network-based shader generation (Gaussian Material Synthesis, SIGGRAPH 2018)
Description:
- a learning-based system for rapid mass-scale material synthesis.
- First, the user is presented with a gallery of materials; the scores they assign are shown in the upper left. -> Here, we learn the concept of glassy, transparent materials.
- By learning from only a few tens of high-scoring samples, our system is able to recommend many new materials from the learned distribution.
- Then, these recommendations can be used to populate a scene with materials. Typically, each recommendation takes 40-60 seconds to render with global illumination.
- In the next step, we propose a convolutional neural network that predicts images of these materials close to the ones generated via global illumination, taking less than 3 milliseconds per image. Sometimes a recommended material is close to the one envisioned by the user but requires a bit of fine-tuning.
- To this end, we embed our high-dimensional shader descriptors into an intuitive 2D latent space where exploration and adjustments can take place without any domain expertise.
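The notes do not spell out how the high-dimensional shader descriptors are embedded into 2D, and the sketch below is not the paper's actual technique: it uses a plain PCA projection (via SVD) as a stand-in to illustrate the idea of mapping descriptors to a 2D latent space. The descriptor dimensionality (16) and the random data are hypothetical.

```python
import numpy as np

def embed_2d(descriptors):
    """Project high-dimensional shader descriptors to a 2D latent space.

    The original system learns a dedicated embedding; here plain PCA
    (via SVD on the centered data) stands in for it.
    """
    X = np.asarray(descriptors, dtype=float)
    X_centered = X - X.mean(axis=0)
    # SVD yields the principal directions; keep the top two.
    _, _, vt = np.linalg.svd(X_centered, full_matrices=False)
    return X_centered @ vt[:2].T

# Example: 200 random "shader descriptors" with 16 parameters each
# (the dimensionality is illustrative, not the paper's).
rng = np.random.default_rng(0)
descriptors = rng.normal(size=(200, 16))
latent = embed_2d(descriptors)
print(latent.shape)  # (200, 2)
```

Any 2D point in such a space can be mapped back toward a descriptor (e.g. via the transposed projection), which is what makes click-and-drag exploration possible without domain expertise.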
- One of our key observations is that this latent-space technique can be combined with Gaussian Process Regression to provide an intuitive color coding of the expected preferences, helping to highlight the regions that may be of interest.
- Furthermore, our convolutional neural network provides real-time predictions of these images that are close to indistinguishable from the real rendered images.
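To make the color-coding idea concrete, here is a minimal, numpy-only Gaussian Process Regression sketch: scored samples in the 2D latent space are fit with an RBF kernel, and the posterior mean is evaluated on a dense grid whose values would drive the color map. The kernel, length scale, noise level, and synthetic preference data are all assumptions for illustration, not the paper's settings.

```python
import numpy as np

def rbf_kernel(A, B, length_scale=0.5):
    # Squared distances between every pair of points in A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale**2)

def gpr_predict(X_train, y_train, X_query, noise=1e-4):
    # Standard GP regression posterior mean with an RBF kernel.
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_star = rbf_kernel(X_query, X_train)
    return K_star @ np.linalg.solve(K, y_train)

# Scored samples in the 2D latent space (scores play the role of
# the user's preferences; the peak location is made up).
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(30, 2))
y = np.exp(-((X - 0.3) ** 2).sum(1))

# Dense grid whose predicted scores drive the color coding.
g = np.linspace(-1, 1, 50)
grid = np.stack(np.meshgrid(g, g), -1).reshape(-1, 2)
scores = gpr_predict(X, y, grid)
print(scores.shape)  # (2500,)
```

Reshaping `scores` back to 50x50 and plotting it as a heatmap over the latent space gives exactly the kind of preference overlay the notes describe: bright regions mark materials the GP expects the user to score highly.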