3D Modeling : Project 2 : THEMED SCENE-BASED MODELING
31/10/2022 - 14/11/2022 / Week 10 - Week 12
3D Modeling MMD60804
NAME: Sea Hirayama
I.D: 0347596
COURSE: Bachelor of Design in Creative Media / Taylor's Design School
●Instructions
●Lecture
Week 10 :
This week, we got the instructions for Project 2 after the lecture. We learned about the GPU/CPU, materials/textures, lighting, etc.
The role of the GPU in Blender.
As Blender is 3DCG modelling software, it is often assumed that a reasonably powerful GPU is required. In reality, however, Blender mostly handles fairly simple data during production, and there are very few situations that demand heavy image processing.
However, even with Blender, there is one process that requires high GPU performance. That is rendering.
Rendering is the process that demands the most from the GPU.
Rendering is the process of calculating and displaying 3D objects, backgrounds, light, air and so on according to how they would look in reality.
If you have ever seen a 3D video production scene, you may have seen something like moving clay in a colourless world. That is a 3D image before rendering, and by giving it colour, light, etc. in the rendering process, it appears to be an actual person or object rather than clay.
Doing these calculations on the GPU is considered faster and less demanding on the computer. The higher the GPU's performance, the faster rendering can be processed and the less likely errors are to occur.
Rendering simple data is not a concern, but if, for example, you are working on a feature-length film, a low-performance GPU can make the process take impractically long.
Furthermore, it is not uncommon for such works to be re-rendered many times because of detailed reworking. Rendering therefore needs to be as fast and reliable as possible, and the higher the GPU's performance, the better.
How to do GPU rendering in Blender.
Basic rendering method
When configuring rendering settings, click on 'Render Properties' from the Properties tab on the right to change various rendering values, including selecting the render engine.
Once you have made the settings, click 'Render' in the top bar and select 'Render Image' to render a still image, or 'Render Animation' to render a video.
Rendering procedure
In addition, here is an explanation of how to use the GPU for rendering.
Preparing to use a GPU
To use a GPU for rendering, open Preferences, click 'System' → 'CUDA', select the GPU you want to use from the list of devices that appears there, and save.
Setting the calculation to GPU
Furthermore, change the device to 'GPU Compute' in the Render Properties and you are ready to go. If you then start rendering as usual, the GPU will be used instead of the CPU for the calculations.
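As a rough illustration of the steps above, here is a minimal sketch using Blender's Python API (bpy), assuming the Cycles engine and an NVIDIA card (hence 'CUDA'); property names follow Blender 2.8+ and may differ in other versions.

import bpy

# Point Cycles at the GPU backend (CUDA for NVIDIA cards).
prefs = bpy.context.preferences.addons['cycles'].preferences
prefs.compute_device_type = 'CUDA'
prefs.get_devices()          # refresh the detected device list
for device in prefs.devices:
    device.use = True        # enable every detected device

# Switch the scene to Cycles and ask it to compute on the GPU.
scene = bpy.context.scene
scene.render.engine = 'CYCLES'
scene.cycles.device = 'GPU'

# Rendering now runs on the GPU instead of the CPU.
bpy.ops.render.render(write_still=True)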
Comparison of GPU rendering and CPU rendering
Verification results when using a Ryzen 7 2700X CPU and a GTX 1060 6GB GPU:
CPU-only rendering with 32 x 32 tiles: 15 minutes
GPU-only rendering with 256 x 256 tiles: 11-12 minutes
Depending on the settings, there also seems to be a way to use the CPU and GPU at the same time to reduce the time further, but if you have to choose between the two, it is clear that rendering on the GPU is faster.
What are materials in Blender?
In Blender, 'material' essentially refers to the material an object is made of. The word originally has a wide range of meanings, such as 'raw material' or 'data', but in 3DCG there is no physical substance, so the assumed material of the model being created is called its 'material'.
For example, if you are creating a model of a cylinder and the assumed material is bamboo, the material is "bamboo", and if you are imagining plastic, the material is "plastic". In other words, even if the model is exactly the same, the material will be different if the assumed material is different.
So far we have talked about materials as if they were physical, but in 3DCG a material exists only as data; it is not a real choice of substance. In 3DCG, therefore, it carries a meaning beyond what the real-world image of 'material' suggests.
For example, even if the material is bamboo, you could produce a model with a varnished surface or a matt painted finish. Materials in 3DCG have various attributes such as reflectivity, texture, softness and colour.
Other light effects, shading intensity, etc. can also be considered as components of a material.
What is a material in Blender?
Blender's renderer, which is designed to behave like the real world, allows you to set different materials for each object (model). Using Blender's built-in functions, it is also possible to create material appearances such as the following:
- Plastic
- Glass
- Wood
- Liquid
- Metal
- Ore
- Other materials not found in the real world
Blender allows you to set light reflectance, texture, hardness, colour and shade. By combining these elements and bringing them closer to the ideal object, you can create the model you want.
Depending on the combination, models that do not exist in the real world can also be created. For example, you can create substances that emit an unusual light and use them in games.
How to set up materials
In this example, we will set a material on a cube object. You can set a material on the object by following these steps:
- Prepare the object.
- Add a new material.
- Set the colour.
Prepare the object
First, prepare the object for which you want to set the material. You can add an object by selecting 'Add' → 'Mesh' → 'Cube' from the top bar in Object Mode.
Adding a new material
Next, add a new material. With the cube selected in object mode, select 'Material properties' from 'Properties' on the right-hand side. Then press the '+' button.
This will cause a 'New' button to appear at the bottom of the same screen, so click on it.
Incidentally, a material has already been set for the cube that is prepared when Blender is launched. If you select it, you can move on to the material setting screen.
Set the colour.
Next, set the colour of the material. You can edit the colour in the 'Base Color' field under 'Surface', lower down the same tab.
You can also change the 'Metallic' and 'Specular' settings below 'Base Color' for simple material adjustments. For example, moving the 'Metallic' value closer to 1 gives the surface a metallic look, while a value closer to 0 gives a matte look.
Detailed material settings are made with the 'Node' system, so these are only simple settings, but they give you a rough idea of the result.
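The same steps can be scripted. Below is a minimal bpy sketch, assuming the material's default 'Principled BSDF' node (the node and input names can differ between Blender versions); the material name and colour values are just examples.

import bpy

# Add a cube and create a new material for it (mirrors the steps above).
bpy.ops.mesh.primitive_cube_add()
cube = bpy.context.active_object

mat = bpy.data.materials.new(name="DemoMaterial")   # name is arbitrary
mat.use_nodes = True
cube.data.materials.append(mat)

# The default node tree contains a Principled BSDF node; set its base colour
# (RGBA, 0-1 range) and nudge Metallic/Specular for a simple surface look.
bsdf = mat.node_tree.nodes["Principled BSDF"]
bsdf.inputs["Base Color"].default_value = (0.8, 0.2, 0.2, 1.0)
bsdf.inputs["Metallic"].default_value = 0.0    # closer to 1.0 = metallic look
bsdf.inputs["Specular"].default_value = 0.5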
Basic knowledge about Blender materials
Next, some basic knowledge about Blender materials is explained, since knowing it makes them easier to work with.
Materials can be set per polygon
In Blender, materials can be set per polygon. For example, select a face of the cube in Edit Mode and add a new material. You can then edit its colour to change the colour of that face only.
Incidentally, if the colour does not change, it may not have been assigned properly, so click 'Assign' in the same tab to apply it.
Save the colour in the material slot
The material slots hold the materials that the object has. For example, as you can see in the right-hand tab of the image below, three materials have been set for one mesh. It is important to know that a single object can hold more than one material. Even if you set several here, the colour will not be applied unless you press 'Assign'.
Conversely, you can decide which colour to set with a single click, which is useful if you are not sure which colour to choose.
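For reference, here is a hedged bpy sketch of per-face assignment: it adds a second material slot and points one polygon at it, which is roughly what pressing 'Assign' does for the selected faces. The material name and the face index are illustrative.

import bpy

cube = bpy.context.active_object

# Add a second material slot holding a new material.
face_mat = bpy.data.materials.new(name="FaceColour")   # name is arbitrary
face_mat.diffuse_color = (1.0, 0.0, 0.0, 1.0)          # viewport display colour
cube.data.materials.append(face_mat)

# Point one face at the new slot (slot indices start at 0, so this is slot 1).
cube.data.polygons[0].material_index = 1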
Materials can be shared.
A material can be shared across multiple objects. For example, let's add a new object.
If you select the faces you want to change as usual and click 'Assign' from the material slot, you can share the material with the newly created cube.
This means that if you have a colour in the material slot, you do not need to set a new material colour, even if you have several models.
Deleted or unassigned materials can be recovered
Even if you delete a material that no longer needs to be assigned, Blender remembers it. So even after deleting it once, the material data remains available until you restart Blender.
Figure 1.1 Screenshot of the process
Figure 1.2 Screenshot of the process of placing the light
Figure 1.3 The process of applying the material colour
Figure 1.4 The process of applying the material colour
What are Blender textures?
What does the word texture in Blender mean in the first place? As it turns out, a texture is a pattern that decorates the surface of a model.
The task of 3D modelling is composing the shape of an object, which is like shaping clay, albeit with a different way of working. Clay is only brown, and it would be difficult to express detailed patterns with it. The same is true of a bare 3D model.
However, if a clay model is dried and painted, much finer patterns can be expressed. The pattern itself is not three-dimensional, but if it is detailed enough it can make the surface look three-dimensional.
For example, when modelling a baseball, if you draw a stitch pattern on a simple sphere, it will look as if there is a baseball there. Textures in Blender are responsible for this kind of work.
However, in Blender texturing, instead of using paint to draw the pattern, a paper with a pre-drawn pattern is pasted onto the surface of the model to express it.
If you learn to work with textures in Blender, you will be able to create these advanced models. In other words, texture work is an essential part of learning Blender.
The basics you need to know about texturing in Blender
When creating Blender textures, there are two basic things you need to know. These are the differences between 'clay creation' and 'Blender creation' as described above, so it is important to understand them correctly.
Combining objects
Models that represent the same material texture should be combined and models with different textures should be kept separate.
For example, if you are creating a model of a doll, this means that it is better to join objects for hands and fingers, and keep arms and clothes separate.
This is because, when texturing in Blender, a coloured sheet is wrapped around the object's surface. When wrapping a sheet of paper around a sphere, it is easy to know where the pattern will end up, but when the model is a sphere with a cylinder attached, it is hard to judge where to place the texture so that the pattern looks right.
In other words, it is better to combine and split objects appropriately to reduce the difficulty of the task.
Basic knowledge of UV expansion
When creating a texture, you need to determine where on a flat sheet each part of the model will be placed. This operation is called UV expansion.
For example, if you are drawing the six numbers of a die on a cube, deciding where the '1' goes gives you the basis for placing the numbers on the other faces.
For a simple cube model, you can colour the faces roughly without deciding exactly where they sit. For complex models, however, deciding where each face is placed can significantly change the result, so this is an important point in Blender texturing.
How to add textures in Blender.
Step 1: Adding a seam
First, start Blender and create a cube model. Then create a 'seam' on the model to set where you want it to be cut apart.
A seam in Blender determines where the mesh is cut so that it can be unfolded. To mark a seam, follow these steps:
- Shift+click to select the vertices or edges you want to cut along.
- Press Ctrl+E with the required edges selected.
- Choose 'Mark Seam'.
The selected edges then turn red and are recognised as edges to be cut. If you have selected the wrong edge, press Ctrl+E again with it selected and choose 'Clear Seam' to remove the seam.
Because the unfolded shape changes depending on how the seams are placed, it is best to aim for a layout that is easy for you to manage. Incidentally, it is also possible to mark seams on every edge and unfold each face separately.
Step 2: UV expansion
Once the seams have been marked, UV expansion is carried out. The main steps are as follows:
- Select all faces.
- Click 'Unwrap'.
First, to select all faces, press the 'A' key on the keyboard to select the whole model.
UV expansion is then carried out by pressing the 'U' key and then clicking 'Unwrap'.
At this stage it looks as if nothing has happened, but if you open the 'UV Editing' tab at the top, you will see the expanded UVs. If the seams are not set properly, mark them again and expand once more.
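A minimal bpy sketch of the same seam-and-unwrap flow, assuming the edges to be cut are already selected in Edit Mode.

import bpy

bpy.ops.object.mode_set(mode='EDIT')

# Mark the selected edges as seams (the same as Ctrl+E > Mark Seam).
bpy.ops.mesh.mark_seam(clear=False)

# Select everything ('A') and unwrap ('U' > Unwrap).
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.uv.unwrap(method='ANGLE_BASED', margin=0.001)

bpy.ops.object.mode_set(mode='OBJECT')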
Step 3: Create the image
Next, create an image to colour the cube.
However, even if you start drawing the texture as it is, nothing will appear on the cube, because there is no image yet to serve as the texture. You therefore need to create an image that Blender recognises so the texture can be displayed. The procedure is as follows.
- Select Texture Paint at the top.
- Click on '+ New'.
- Create it with the colour set to white.
You will then have a pure white image laid out to match the UVs.
To reflect this image on the cube, you need to set the material. This operation allows the image to be reflected.
- Select 'Material' in the Properties tab
- Click on '+New'.
- Click on the round icon next to the base colour
- Select 'Image texture'.
- Click the 'Browse Image to be linked' button and choose the image you created.
The cube then becomes a pure white object with no pattern on it at all.
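Here is a hedged bpy sketch of the image creation and material hookup above, assuming an existing node-based material with a default 'Principled BSDF' node; the image name and size are arbitrary.

import bpy

obj = bpy.context.active_object
mat = obj.active_material          # assumes a node-based material already exists

# Create a blank white image to paint into (the Texture Paint '+ New' step).
img = bpy.data.images.new(name="CubeTexture", width=1024, height=1024)
img.generated_color = (1.0, 1.0, 1.0, 1.0)

# Add an Image Texture node and plug it into the Principled BSDF base colour.
nodes = mat.node_tree.nodes
links = mat.node_tree.links
tex = nodes.new(type='ShaderNodeTexImage')
tex.image = img
links.new(tex.outputs["Color"], nodes["Principled BSDF"].inputs["Base Color"])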
Step 4: Draw a fine pattern on the image
Finally, draw a detailed pattern on the image.
There are two ways to draw: 'drawing directly in Blender' and 'drawing in external paint software'. The details are described below, but here we will paint in Blender.
- Select the 'Active Tool and Workspace Settings' tab at the top of the Properties panel.
- Paint with a brush
Incidentally, in Blender, even if your strokes stray off the model, that part is simply not applied, so it is not a problem. Even so, it is generally recommended to edit the flat image in dedicated image-editing software.
Step 5: Apply the texture
If the image is painted directly in Blender, this step is not necessary, but it is needed when embedding an image created in other software.
To apply an image, follow the steps below.
- Select the material properties
- Click Open image.
How to draw a detailed pattern on an image
There are two possible methods for drawing fine patterns. Each method is explained in detail.
Drawing with Blender
The first method is to draw directly in Blender. This was covered earlier; in Texture Paint mode you can change the size and strength of the brush, and by using these well it is possible to draw reasonably fine patterns.
However, if you want to draw more precise patterns, it is better to use external paint software, which has a far richer set of tools. For this reason, we basically recommend the 'drawing using external software' method described below.
Drawing using external software
When drawing with external software, the texture image is created in that software and then applied to the material, which makes it possible to draw fine patterns.
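As a small illustration, an externally painted image can also be loaded and assigned by script. This sketch assumes an existing 'Image Texture' node with its default name; the file name is hypothetical.

import bpy

# "//" makes the path relative to the .blend file; the file name is hypothetical.
img = bpy.data.images.load("//cube_texture.png")

# Point the existing Image Texture node (default name) at the loaded image.
mat = bpy.context.active_object.active_material
mat.node_tree.nodes["Image Texture"].image = img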
Ambient occlusion is rendered in Japanese as 'environmental shielding'. It is a technique used very widely in the CG industry, for both offline and real-time rendering, because it is relatively cheap to compute yet produces high-quality results that take global illumination into account.
For a convincing image, the shading of objects must be expressed well. This cannot be achieved with a local illumination model that considers only the point being rendered, but it can be achieved with a global illumination model that considers the entire scene. Ambient occlusion is one method of realising such a global illumination model.
First, the motivation for introducing ambient occlusion is explained.
First, we consider a model of how the point of interest is affected by ambient light. (The point of interest is, for example, a point in the scene corresponding to the pixel you want to render.)
In a general situation, the point of interest can be thought of as being affected by two types of light. One is direct light, which reaches the point directly from the light source; the other is indirect light, which reaches it after direct light has been reflected repeatedly by surrounding objects. In this article, indirect light is referred to as ambient light, since we are particularly interested in ambient occlusion.
Now, it is known that very realistic CG can be obtained if the influence of this ambient light is accurately taken into account. However, there is also the fact that it is very computationally expensive, as all inter-reflections of light between objects have to be simulated.
Attempts have been made for decades, in both offline and real-time rendering, to calculate the effect of this indirect, ambient light quickly and accurately, and quite good results have been obtained. Even so, it is still very difficult to compute the ambient light in its entirety. The motivation for ambient occlusion is therefore to strike a balance between quality and computation time: use a model that takes the whole scene into account to some extent (i.e. still considers global lighting) while keeping the calculation reasonably fast.
Approximation of ambient light effects
The first consideration is to approximate the costly ambient-light calculation in some way. This part is not essentially related to ambient occlusion itself. In real-time rendering, for example, the calculation can be approximated with a constant value called the ambient term. Results from Image Based Lighting may also be used (IBL is exactly this: the ambient light distribution is supplied as an image and its effect is calculated from it).
In the following example, the ambient light component of the input is assumed to be constant.
Calculation of the environmental shielding term
Now suppose the ambient light component has been determined. This still does not take into account the surrounding geometry beyond the point of interest; rendering the final result without considering it amounts to rendering with a local illumination model, which, as mentioned above, does not produce a very convincing picture. Ambient occlusion does consider the surrounding scene, but only the degree to which the point of interest is occluded by it. This approximation gives high-quality results that account for global illumination at a much lower cost than simulating the full inter-reflection of light between objects.
How the environmental shielding term is calculated is very different between offline rendering and real-time rendering.
In offline rendering, where it is acceptable to spend some time on the calculation, the environmental shielding term is often computed by casting rays from the point of interest in all directions over the hemisphere and counting how many of them hit surrounding objects.
For real-time rendering, on the other hand, ray tracing is too time-consuming. The mainstream method therefore seems to be to reconstruct the geometry around the point of interest from a depth map and roughly approximate the degree of shielding in screen space (so-called Screen Space Ambient Occlusion).
Detailed calculation formulae are described below.
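As a rough, non-authoritative illustration of the ray-cast approach described above for offline rendering, here is a small Monte Carlo sketch in plain Python; occluded() stands in for a hypothetical ray-visibility query against the scene.

import math
import random

def _hemisphere_direction(normal):
    # Uniform direction on the unit sphere, flipped into the hemisphere of `normal`.
    while True:
        v = (random.uniform(-1, 1), random.uniform(-1, 1), random.uniform(-1, 1))
        length = math.sqrt(sum(c * c for c in v))
        if 0.0 < length <= 1.0:
            d = tuple(c / length for c in v)
            cos_theta = sum(d[i] * normal[i] for i in range(3))
            return (d, cos_theta) if cos_theta > 0.0 else (tuple(-c for c in d), -cos_theta)

def ambient_occlusion(point, normal, occluded, samples=64, max_dist=1.0):
    # Monte Carlo estimate of the shielding term at `point`:
    # ~1.0 for a fully open point, ~0.0 for a fully enclosed one.
    # `occluded(origin, direction, max_dist)` is a hypothetical visibility query
    # that returns True when a short ray in that direction hits nearby geometry.
    visible, weight = 0.0, 0.0
    for _ in range(samples):
        direction, cos_theta = _hemisphere_direction(normal)
        weight += cos_theta                  # the cosine factor of the AO integral
        if not occluded(point, direction, max_dist):
            visible += cos_theta
    return visible / weight if weight > 0.0 else 1.0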
Relationship between the environmental shielding term and ambient light
This time, the ambient light was assumed to be calculated with the local illumination model, without considering the influence of surrounding objects. The environmental shielding (ambient occlusion) term, on the other hand, expresses the degree of shielding by surrounding objects, and is basically a value between 0 (completely shielded) and 1 (not shielded at all).
Therefore, when rendering, the ambient light contribution is multiplied by the environmental shielding term, and the direct light contribution is added to that.
A common mistake at this point is to compute (direct light contribution + ambient light contribution) × environmental shielding term. The shielding term applies only to ambient, i.e. indirect, light and has nothing to do with direct light. The correct formula is: direct light contribution + (ambient light contribution × environmental shielding term).
(Strictly speaking, this is the physically based formulation; in practice it also depends on how you want the final image to look.)
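In code form, the combination rule above looks roughly like this (the inputs stand for illustrative per-pixel light contributions):

def shade(direct, ambient, shielding):
    # The shielding (ambient occlusion) term, 0..1, attenuates only the ambient part.
    return direct + shielding * ambient
    # Common mistake: return (direct + ambient) * shielding  -- this also darkens direct light.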
Figure 1.5 Image of applying texture detail
IOR (Index of Refraction) Refractive Index Memo.
The Index of Refraction (IOR) is a dimensionless optical quantity that describes how much light bends as it passes from one medium into another. CG software often provides a parameter for setting the IOR value. Typical values are listed below; a small Snell's law example follows the list.
- Vacuum 1.0
- Air 1.0003
- Alcohol 1.329
- Water 1.330
- Ice 1.333
- Acrylic resin 1.490
- Polyurethane resin 1.400-1.700
- Glass 1.500
- Polyethylene resin 1.530
- Emerald 1.570
- Ruby 1.770
- Sapphire 1.770
- Quartz 2.000
- Diamonds 2.419
- Zinc 2.400
- Chromium oxides 2.705
- Aluminium 2.62
- Aluminium oxide 1.76
- Cadmium 2.49
- Tungsten 2.76
- Iron 2.36
- Copper (copper oxide) 1.95 (2.71)
- Titanium 2.52
- Magnesium 1.50-1.70
- Manganese 2.50
- Platinum 2.90
- Mercury 2.95
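As a small worked example of what these values mean, Snell's law relates the angles on either side of an interface; the sketch below uses the air and water values from the list (the function name is just for illustration).

import math

def refraction_angle_deg(incidence_deg, n1, n2):
    # Snell's law: n1 * sin(theta1) = n2 * sin(theta2).
    s = n1 * math.sin(math.radians(incidence_deg)) / n2
    if abs(s) > 1.0:
        return None          # total internal reflection, no refracted ray
    return math.degrees(math.asin(s))

# Light entering water (IOR 1.330) from air (IOR 1.0003) at 45 degrees
# is bent to roughly 32 degrees from the surface normal.
print(refraction_angle_deg(45.0, 1.0003, 1.330))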
Week 10 :
This week, after we received feedback on all of our assignments so far, we learned about UV editing.
What is UV expansion?
UV expansion is preparation for applying a texture to an object: it is the technique of unfolding an object's mesh into a flat layout.
Why is this needed in the first place? A texture is like a sticker on a 3D model - without UV expansion, the sticker ends up stretched or shrunken in places.
Even if you draw a pattern on it, it will be distorted and it will be difficult to achieve the intended look. So you need to do UV expansion to get the mapping back to the correct shape.
If the UVs are distorted, you can see the texture being stretched vertically or horizontally; by expanding the UVs properly, you can make it look the way you intended.
Basic procedure for UV expansion in Blender
Step 1: Seam the 3D model
To expand the UVs of a 3D model, seams must be marked on its edges. By seaming the model correctly, UV expansion produces less distortion and textures become easier to draw.
First, select the edge where you want to put the seam.
Once you have selected the edge, right-click on it and click 'Mark seam'!
The selected edge is now highlighted in red.
There is an item called 'Mark sharp' nearby, but it has a different function, so be careful not to confuse the two.
If you mark a sharp edge, it will be highlighted in blue instead of red.
Step 2: Switch to UV Editing and expand.
After switching to the UV Editing tab, select the whole model in Edit Mode and unwrap it (for example, press U → 'Unwrap' in the 3D viewport); the result appears in the UV screen on the left-hand side.
Step 3: Once expanded, fit the UVs within the UV range.
In this case there is no problem because the single UV island already fits, but otherwise you can adjust the UVs to fit within the range by selecting them and using the Move and Scale tools.
Causes and solutions for poor UV expansion
Sometimes the expanded shape clearly does not look right even though seams have been marked.
The UV shape does not match the model even after expansion
Even if you mark seams and expand correctly, the shape of the UVs may not match the 3D model, and an error message is displayed at the bottom of Blender.
In this case, check the scale value of the model
You can display the transform by pressing the 'N' key.
If a value other than '1' is entered, return to object mode and click on 'Object' -> 'Apply' -> 'Scale' in the upper part of the 3D view.
When you check the scale of the 3D model again, it is set to '1'.
In this state, expand the UVs again.
The error message should disappear and the UVs should have the same shape as the 3D model.
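The same fix can be done by script; a minimal bpy sketch, assuming the problem object is active:

import bpy

# Apply the object's scale so it reads 1.0 on every axis
# (the same as Object > Apply > Scale), then expand the UVs again.
bpy.ops.object.mode_set(mode='OBJECT')
bpy.ops.object.transform_apply(location=False, rotation=False, scale=True)

bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.uv.unwrap()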
Selecting vertices on a UV also selects unintended places.
When you select a vertex or edge on a UV and try to edit it, it may select the location of the overlapping vertex or edge on the 3D model.
In this case, the "Synchronise UV selection" symbol with two arrows in the top left corner of the UVEditing screen should turn blue, click on it to switch it off.
After switching it off, select the vertices again.
If no other vertices are selected than the one you clicked on, you are OK.
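For reference, the same toggle is exposed to Python; a one-line hedged sketch:

import bpy

# Turn off "UV Sync Selection" so picking a vertex in the UV editor no longer
# selects the overlapping vertices on the 3D mesh as well.
bpy.context.scene.tool_settings.use_uv_select_sync = False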
Extra UVs appear that do not match the 3D model
If you mark seams and press 'Unwrap' but UVs appear for a shape you don't recognise, check the 3D model.
Sometimes, what looks like a single object may actually be several overlapping objects of the same type.
In this case, keep only the models you need and delete the ones you don't need.
●Project 2 : THEMED SCENE-BASED MODELING
Instruction :
1. You are required to model a scene-based environment on the provided themed concept using the various techniques you have learned. The final output will be a diorama or isometric composition.
Search reference images or blueprints.
2. Model the selected object using polygon tools and techniques based on what you have learnt.
3. Apply the right material and texture to give its appearance.
4. Set lighting and rendering using ‘Eevee’ as the rendering engine.
5. Use Eevee rendering (Blender) for the final output. Output size is 1280 x 720 in PNG format (see the render-settings sketch after this list).
6. Upload in your e-portfolio and update the link on TIMES platform as submission.
7. Submission is on Week 12, 14th Nov 2022.
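As a hedged sketch of the render settings the brief asks for, using bpy (the output path is hypothetical):

import bpy

scene = bpy.context.scene

# Eevee as the render engine, as the brief requires.
scene.render.engine = 'BLENDER_EEVEE'

# 1280 x 720 output in PNG format.
scene.render.resolution_x = 1280
scene.render.resolution_y = 720
scene.render.resolution_percentage = 100
scene.render.image_settings.file_format = 'PNG'
scene.render.filepath = "//project2_render.png"   # hypothetical output path

bpy.ops.render.render(write_still=True)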
Examples :
1. Sci-fi
2. Cyberpunk
3. Steampunk
4. Mediaeval
Figure 2.7 Examples of mediaeval
Week 9 :
For this week, I decided to create a Japanese city theme with a steampunk and neon mood. I prepared the base forms and added details.
Figure 3.1 screenshot of the process
Week 10 :
This week, I put the colors on it. I also edited the materials, textures and lighting.
Figure 4.2 screenshot of the process
Week 11 :
This week I also edited the color tone and details.
Figure 5.1 The process of putting the color and details
●Feedback
Week 9 :
General feedback
This week, I enjoyed searching for references for this project on Pinterest. There are lots of kinds of dioramas with detailed designs. I looked forward to creating mine, because it was my first time designing a scenery piece.
Week 10 :
General feedback
After I finished preparing the base shapes, I edited in more details. It took me a lot of time, but I also learned about how buildings are composed.
Week 11 :
General feedback
Finally I tried to explore details such as posters and so on. It was really difficult for me to compose them well. I also wanted to express a night mood, which was difficult and took a lot of time to design.
●Reflections
>Experience
From this project and the lectures, I learned more about lighting, textures and materials. I also enjoyed learning about building design, as it was my first time thinking about a whole scene.
>Observations
This project was really important for getting used to Blender and discovering more of its potential.
>Findings
I learned about UV editing, which helps to make the design better.
>Further readings/references
Maya Practical Hard Surface Modelling: workflows from Prop and Background (CG Pro Insights)
Book by Eiji Kitada, a CG modeller who has worked on various projects for overseas studios. He is also famous on Twitter.
The book explains how modelling is done from concept art. It covers hard-surface polygon modelling without sculpting. He explains in detail not only how to keep simple models clean, but also data management that takes the workflow of the entire team into account.
The book shares the knowledge of someone who has actually worked on the front line overseas, so if you are currently working as a CG artist, you are sure to find useful information.
Blender 2.8 3DCG Super Techniques
This book has a cute little girl model on the cover.
As mentioned at the beginning of the book, it explains the wide range of functions that can be performed in Blender one by one. It is a type of introductory book that does not specialise in one function such as modelling or rendering, but rather summarises what you can do when using Blender.
It is not an in-depth book, but rather a broad and shallow introduction.
However, if you want to learn about the basics of 3D from scratch, this is a book I would highly recommend to beginners because of the detailed explanations.
By reading through the whole book, you can get an idea of what you can do with Blender. Of course, it is not so easy to make a high quality model like the one on the cover just by reading this introductory book.
However, I think you will be able to understand the path to complete some models.
Free Blender CG Animation Techniques ~understand the structure of 3DCG and how to move it [Blender 2.8 compatible edition].
The introductory books introduced so far have been about Blender's overall functions, but this book specialises in 'animation'.
So this is an introductory book for people who already have some modelling skills, and it is not recommended for people who have never touched Blender at all.
It is best suited to people who can model but don't know how to animate. The content itself is not too difficult, so it is a book for intermediate-level users.
It also touches on things other than animation, but if you want to understand the whole picture of Blender, we recommend either of the two books mentioned above.