Design / Media Trends for the next few years.

3D

Published 13th February 2021

Every year around New Year, articles about design trends pop up. I wanted to do something different and list trends or movements that I believe will stay for much longer than a year, which is in my opinion much more useful information because it enables you to learn new things that stay relevant long term.

AI Tools

More and more software companies like Adobe or Autodesk will build better AI-driven features into their software. There are already features like object recognition and AI filters to change faces inside Photoshop. Another really popular AI feature is the denoising of 3D-rendered images.

There are also interesting standalone programs like Cascadeur or AI Gigapixel that are built around the idea of doing something better than the traditional method with the help of AI.

Those tools and features will appear more and more often over the next years and can make creators more powerful, but they will also raise the bar.

3D

A few years ago, 3D was a remarkable and special skill; I even got my first job as a motion designer because I had some 3D skills. But nowadays you rarely find a job description for a graphic or motion designer that doesn't include something like "basic skills in C4D or Blender are a plus". So the demand is definitely there, and one day it will be the standard.

Or in other words: if you are a creator, maybe it is time to learn it. Because let's be honest, during these times we all sit at home and play video games or watch pointless YouTube videos and TikTok most of the time anyway.

Here is the tutorial I recommend to every beginner: YouTube.com/Blender Beginner Tutorial – Part 1
And here is the download link for Blender: Blender.org/Download

Online Events

Since last year, many events never happened the way they were planned due to contact restrictions, which forced some event organizers to look for alternatives.

And online events are extremely practical, since they are relatively inexpensive and not limited by location. As the possibilities grow, this will probably remain part of the future, especially for events such as fairs.

Facebook Alegria

Not many people have heard of it, but everyone has seen it. Alegria is the name of the ecosystem of Facebook illustrations and probably one of the most copied design ecosystems on the internet.

And there are many reasons that speak for it: it is inclusive, since the people shown are not recognizable as specific humans and are too abstract for anyone to feel discriminated against; the style is easy to recreate and adapt, which saves time and money; and on top of that, it works really well as a simple vector file, which makes it extremely lightweight and performant for web applications.

The downside is that everything looks more or less the same. It is just not interesting; it is more like a placeholder.

The actual creator of this style is the LA-based design agency Buck.

Less demand for simple, easy things

More and more small businesses recognize that it is important to also be present on the web and in media, which created the market for easy-to-use media creation tools such as Adobe Spark or Canva. Platforms such as Fiverr also profited from this need.

That means there will be less demand for graphic designers doing simple things, or doing work for small businesses that don't really have a marketing budget.

Holistic Design Systems

Almost every big tech company has a design system: a system that is adaptable to every situation and every medium. Since it looks like it will become the norm that companies are represented on multiple platforms and mediums, there will be a need for holistic concepts and ideas.

Web micro-Interactions

Since smartphones have become advanced enough over the last few years to support almost everything a desktop does, and internet speeds constantly increase, the web has become more of a playground for designers. Websites can be bigger and more complex, which enables web developers to create a dialog instead of a monologue: a website that talks back to the user, which can make communication much easier.

The most important example of this, in my opinion, is the "message was sent" confirmation after clicking send on a form, because without it the user is unsettled, becomes unhappy, and subconsciously connects bad feelings with the website and the website owner. I think there are a lot of examples like this in which the communication of a website can be improved through micro-interactions.

There are already people whose only job is to create such micro-interactions, even though I think it is currently more part of a UI designer's job.

VR/AR

Currently, VR and AR are more of a gimmick or a niche. But big tech names such as Mark Zuckerberg or Tim Cook think they will be the future. I think this is currently fueled heavily by the contact restrictions and lockdowns, which created the need to shift more things into the online world.

Especially for fairs, many clients ask for VR and real-time solutions. And currently, it is a real niche. This means it is perhaps a good train to jump on right now, and maybe even build a company around this field.

Human-focused Design

Human-focused design is the idea of designing something with the realistic behavior of humans in mind. It means making text big enough that it is easy to read, and making things less boring so that people don't instantly lose interest, especially on the internet. It is about choosing colors with the question of how they influence the behavior of the viewer in mind. Overall, it is about letting psychology and knowledge of human behavior influence the design.

Back to basics

It is a more general development of the last 20 years that people seem to step back a bit from constant progress: they care more about work-life balance, reducing stress, feeling connected to nature, consuming less, being more minimalistic, and focusing on what is important.

And I think we are seeing something similar when it comes to media. Minimalism, for example, is a design philosophy that moved into the media world roughly 10 years ago. But it is more than that. People want to spend less time on their phones and also value printed media more again.



Tips for better camera Animations

3D

Published 31st January 2021

Orientate yourself on what is possible in reality and what is not

In the article Tips for more realistic renderings in Blender, I already wrote about how important it is to have reference images of something real for orientation. You can apply the same thing to your camera animations.

Focal Length

Do some research on lenses: which lenses exist and what they are used for. If you are making an animation of a landscape, for example, your focal length is maybe 12 mm, which is the shortest focal length that ARRI offers.

If you are making an animation of something that is supposed to look far away, like an airplane, a rocket, or a car, maybe you should use something like 155 mm or 280 mm, the longest focal lengths you can find on an ARRI lens.

If you use values far above or below that range, it will look a bit weird, because such lenses don't exist and humans will subconsciously notice that something is off.
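If you want to set this via Python rather than in the camera settings, the focal length is just the lens value of the camera data in millimetres. A minimal sketch (the camera name is an assumption):

```python
import bpy

# Assumes the scene contains a camera object named "Camera"
cam = bpy.data.objects["Camera"]

# Focal length in millimetres; stay within the range of real lenses,
# e.g. 12 mm for a wide landscape shot, 155-280 mm for distant subjects
cam.data.lens = 35.0
```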

Who is holding the Camera?

Is a person holding the Camera? Is the Camera on a tripod? Or is the Camera on a Camera Robot? Is your Camera a GoPro on the head of a Dog?

There is an endless number of things that your camera can be attached to, and all of them influence how the camera moves, which camera and lens get used, and where your camera is positioned. If you have a handheld camera, it is unlikely that the camera is 3 meters above the ground. It is more likely that it is 170 cm above the ground and shaking a bit (Ian Hubert made a quick tutorial about that).

Think about what is holding your camera and what influence it will have.

Maybe your image isn't 100% sharp and has some noise and distortion as well, all things that can help you make your animation more believable.
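One simple way to fake a handheld camera in Blender is to put a Noise modifier on the camera's F-Curves. This is only a rough sketch of that idea; the strength and scale values are guesses you would tweak by eye:

```python
import bpy

cam = bpy.data.objects["Camera"]

# Keyframe the current transform so the camera has F-Curves to modify
cam.keyframe_insert(data_path="location")
cam.keyframe_insert(data_path="rotation_euler")

# Add a subtle Noise modifier to every camera F-Curve to imitate hand shake
for fcurve in cam.animation_data.action.fcurves:
    noise = fcurve.modifiers.new(type='NOISE')
    noise.strength = 0.02   # how far the camera wobbles
    noise.scale = 25.0      # how slowly the wobble changes over time
```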

Use Motion Blur!

Motion blur is literally just one click to set up, and so helpful at the same time.

It makes the animation feel more real, because every video taken by a real camera has motion blur.

It helps you to hide some mistakes and can cover up some missing details.

Also, a director with whom I work every once in a while is known for using motion blur as a transition effect.
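In Cycles that one click corresponds to a single property on the render settings; a minimal sketch:

```python
import bpy

scene = bpy.context.scene
scene.render.use_motion_blur = True      # the "one click"
scene.render.motion_blur_shutter = 0.5   # default: the shutter is open for half a frame
```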

Use depth of field

Depth of field also exists in every video and can likewise help you hide some details.

It is also a tool to guide the viewer's attention, so put some thought into where to place the focus.

Often I create an empty object which I use as the focus point, because it makes animating the focus much easier. It is possible without an extra object, but that makes it harder and can cause problems if you change the animation of the camera later on.

Most people associate a strong depth of field with high quality, just like they associate a lot of bass with high-quality sound. The reason is that big image sensors and lenses with a wide aperture, which are both found on more expensive cameras and lenses, produce more depth of field. But because of that, too much blur is a common beginner mistake. Use it, but don't overuse it.
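Here is a small sketch of that empty-as-focus-target setup in Python (object names, f-stop, and frame numbers are just example values):

```python
import bpy

# Create an empty that will act as the focus target
focus = bpy.data.objects.new("FocusTarget", None)
bpy.context.collection.objects.link(focus)

cam = bpy.data.objects["Camera"]
cam.data.dof.use_dof = True
cam.data.dof.focus_object = focus     # the focus distance follows the empty
cam.data.dof.aperture_fstop = 2.8     # lower = stronger blur; don't overdo it

# Animate the empty to pull focus instead of keyframing the camera itself
focus.location = (0.0, 0.0, 1.0)
focus.keyframe_insert(data_path="location", frame=1)
focus.location = (0.0, 4.0, 1.0)
focus.keyframe_insert(data_path="location", frame=60)
```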

Make use of the graph editor


A keyframe describes a certain state at a certain frame, but no one creates a keyframe for every frame. That leaves the question of what happens between two keyframes: does it transition from one state to the other at a constant speed, does it start slow and end slow, or does it start fast and end slow? This is what the curves in the Graph Editor describe. In most cases, it is good if the curves are smooth like butter. But it really depends.
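The same thing can be controlled from Python: every keyframe point carries an interpolation mode and handle types, which is exactly what the Graph Editor curves visualize. A minimal sketch (the object and frame values are placeholders):

```python
import bpy

cube = bpy.data.objects["Cube"]
cube.location.x = 0.0
cube.keyframe_insert(data_path="location", index=0, frame=1)
cube.location.x = 5.0
cube.keyframe_insert(data_path="location", index=0, frame=48)

# Smooth, "butter-like" curves: Bezier interpolation with auto-clamped handles
for fcurve in cube.animation_data.action.fcurves:
    for key in fcurve.keyframe_points:
        key.interpolation = 'BEZIER'            # alternatives: 'LINEAR', 'CONSTANT'
        key.handle_left_type = 'AUTO_CLAMPED'
        key.handle_right_type = 'AUTO_CLAMPED'
```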

Avoid having no movement

This is a mistake I frequently see in bad product videos, where the animation just stops and there is only a static image. In every good video there is always some motion. It doesn't need to be the camera, but there always needs to be something moving.

Put some thought into the image composition

One thing I really like about Pixar movies is that every frame looks like an image that could be printed out and put into a picture frame. This doesn't need to be the case, but go to different frames in your animation and ask yourself: is this frame good on its own, and how can I improve it?

There is an endless number of things and guidelines, and it is a huge topic, so I linked a separate article about it.

Choose the right frame rate

Frame rates also have an impact on how an animation feels.

6-12 Frames per second

This is the typical frame rate of an anime. The reason for that is that in the past, every frame was drawn by hand, which meant more frames drastically increased the amount of time and cost required. Now it is more of a nostalgic thing, used to create this hand-drawn look.

But it also raises the question of whether it still makes sense, if more and more people can't appreciate it because they are simply too young to be used to this frame rate. To them, it just looks choppy.


23.97 Frames per second

According to science, this is the number at which the human eye sees movement instead of single images.

24 Frames per second

The standard frame rate in cinema. When videos were shot on film, film was expensive and more frames required more film, which is why they shot slightly above the minimum requirement for human eyes. And no one has seen a reason to change that to this day. This means if you want something to look cinematic, whatever that means, this is the right frame rate for you.

25 Frames per second

This is the standard frame rate of TV here in Europe.

30 Frames per second

It is the standard frame rate of TV in the US and the most common one on the internet these days.

60 Frames per second

Makes the video feel almost unnaturally smooth. I think it feels fascinating and is good for videos where it is important that people can recognize what is happening on the screen.

Think outside the box

There are so many ways you can create a video differently. No one said that a video needs to be 16:9. Why not make it upright, or with multiple views at the same time? I am still waiting for the first ad agency that creates a multimillion-dollar video project in 9:16 because their target group primarily uses smartphones.

But I think I know why it will take a while until that happens.

“As I said, balls, we need balls. If you know what that means.”

Oliver Kahn




Every Render-Engine for Blender

Updated 24th April 2021

What is a Render-Engine?

A Render-Engine is an application which takes the 3D models and all the information like lights and textures and converts them into a 2D image.

Here I explained some terms that might be new to you and that will appear in this article.

Most modern Render-Engines, including most on this list, are ray tracing engines. They work by simulating the light rays that are emitted by a light source and bounce around the 3D space into the camera.


A fork is when a developer takes preexisting software and continues it in another direction.

rendered with logo of appleseed

Appleseed is the youngest Render-Engine project in this lineup. It is also an open-source ray tracing engine, but with a focus on VFX, which means there are fewer features for stylized renderings, but things like caustics, motion blur, and bokeh are wonderful compared to other Render-Engines. The only problem most people will have is performance.

CGI dog following stick

Cycles is not only the default Render-Engine of Blender, it is also my favorite engine. It is by far the most versatile engine on this list. It is also one of the quicker ones, especially with the new denoiser.

One thing I am noticing currently, while switching to Octane Render for the media production company I am currently working at, is that Cycles has a lot of different nodes, which allows you to create extremely complex materials like in these examples from #Nodevember.

In general, I have the feeling that Cycles allows you to work more freely and creatively because of the many options.
The only problems with Cycles are caustics, volumetrics, and more complex scenes.

E-Cycles was the first fork of Cycles with the goal of making it faster without sacrificing quality. According to the website, it is up to 100% faster than normal Cycles on Nvidia GPUs. But because of the high price of $299 for the current version that supports RTX GPUs, I wasn't willing to pay that much for a fork of already existing software, which is why I can't say much about it.

Eevee

CGI flying car

Eevee is Blender's built-in real-time Render-Engine (Epic Games, the company behind the Unreal Engine, has supported Blender's development with a grant). The reason to use Eevee is clearly performance. Not having to render for a long time will also enable those who don't have access to modern hardware. I am thinking of Nollywood here (no, this isn't a mistake, I am talking about the Nigerian film industry, which most people don't know about, but which is in fact the second largest in the world). Maybe Eevee will enable many studios there to create animated movies.

But it is also the Render-Engine that delivers the worst quality. And it is also limited, especially when it comes to things like glass or volumes.

K-Cycles is the latest big fork of Cycles. It also has the goal of making Cycles faster. I am currently testing it so that I can validate the information myself, but so far it looks promising. I really like the simple user interface, which has basically one option, which means there isn't anything to learn.

Because of the performance and the, compared to E-Cycles, low price of $49, I think it is interesting: it doesn't require learning new things, and graphics cards aren't really available at the moment.

glass brick with caustics

The LuxCoreRender project is the successor of the LuxRender project: an open-source ray tracing Render-Engine with a focus on realism and physical accuracy. A big difference to other Render-Engines is that LuxCoreRender is excellent at creating caustics, which means if you have a scene with a lot of transmissive materials like glass, vodka, or other fluids, it can make a huge difference in realism.

The downside, of course, is that it is a bit slower in most cases, at least in those without caustics. But since it is also compatible with Intel's Open Image Denoiser, this isn't a problem anymore.

LuxCoreRender comes with two different algorithms.

Path Tracing

This is the same technique as in most other Render-Engines and is able to render with the CPU and multiple GPUs at the same time. Here you can also make use of NVIDIA's OptiX technology if you are the proud owner of an RTX graphics card. The big difference to other Render-Engines is the Light Tracing feature, which, if activated, enables you to create good caustics.

Bidirectional

Explaining the difference here would be really complicated, but if you want to know exactly what it does, you can watch this video. What you need to know is that it is the most physically accurate rendering algorithm currently available for Blender.

Here is a comparison between Cycles and the two algorithms of LuxCoreRender. One thing that is clearly noticeable is the caustics on the blue wall and how the bidirectional path tracer is able to capture way more light, which is the reason it is brighter. In the image rendered with the bidirectional path tracer you can even see caustics from the pink wall on the blue wall if you look closely.

astronaut on parking lot cgi

Octane Render was the first GPU ray tracing engine and was the fastest production renderer back then. This is the reason it is the most popular Render-Engine outside the Blender universe. When it comes to performance, it is comparable to Cycles. But Octane delivers more realism and is much better when it comes to things like volumes and subsurface scattering.

It comes with three render algorithms, each having its own purpose.

Direct Light

This is the fastest algorithm and delivers the lowest quality. It has what I would call the "Octane look" to it, which some people tend to like, but in my opinion it is a bad thing. It seems to be used mostly as an algorithm to create previews.

Path Trace

It is comparable to Cycles. It doesn't have a certain look to it, but it isn't noticeably faster than Cycles either, except for things like volumetrics or subsurface scattering (SSS), where Octane is way faster.

PMC

It seems to be the same as Path Trace but slower, which is not true. It uses another method to distribute the light rays, which makes it faster for very complex scenes with a lot of glass, for example, or for some indoor scenes.

But there are also two downsides to this Render-Engine. One is that Octane only works with Nvidia GPUs, not with AMD GPUs or CPUs in general.
The other downside is that, at $699 per year, it is very expensive. Meanwhile, there is also a free tier, but it only supports one GPU and no denoising.

Keep in mind that you need an RTX GPU to use Octane's denoiser. If that's not the case, you can still use the IOID denoiser that comes with Blender since version 2.81. You can read about it in this article: Blender 2.81 new Denoiser (IOID) a real Game changer.

CGI motorbike with AMD Logo

The Radeon Pro Render is the same Render-Engine as the Pro Render which comes with Cinema 4D.

It is a ray tracing engine developed by AMD. The only reason for its existence is that it is the only GPU Render-Engine that works on a Mac. This is also the reason the new Mac Pro gets advertised as rendering 6.8 times faster with AMD's Pro Render, in comparison to an iMac Pro.

But the Radeon Pro Render is kind of dead by now, and you shouldn't buy a Mac for CGI or VFX.

Workbench Engine

The Workbench Engine is already integrated into Blender, but I think it rarely gets used and doesn’t get the attention it deserves.

The Workbench Engine was actually created as a viewport engine for modeling and sculpting. But it can also be helpful for other things, like exporting something completely shadeless and flat, for example a screen in a mock-up.

I am sure there are a lot of different ways to make use of it. I used it, for example, in a 2D explainer for a teeth cleaning product that was made in a flat style in After Effects, but the client said it was too hard to understand what the object was, so he wanted it to rotate instead of being completely flat. And to match the flat style, I exported it with the Workbench Engine.
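Switching engines is a single property on the scene, so a flat, shadeless Workbench export can be scripted in a few lines; a sketch (the shading options are examples):

```python
import bpy

scene = bpy.context.scene
scene.render.engine = 'BLENDER_WORKBENCH'   # others: 'BLENDER_EEVEE', 'CYCLES'

# Flat, shadeless look: no lighting, plain material colors
scene.display.shading.light = 'FLAT'
scene.display.shading.color_type = 'MATERIAL'

bpy.ops.render.render(write_still=True)
```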

What about the other ones?

There are also Render-Engines like RenderMan, YafaRay, POV-Ray, and so on. But none of them are compatible with Blender version 2.9+, which makes them irrelevant here.

I will try to keep this article updated if something changes. Currently, it looks like Pixar's RenderMan will be compatible with Blender in the upcoming version 24, which is planned to be released in early 2021. Also, Maxon is currently developing an integration of Redshift for Blender, which is something to keep an eye on, since Redshift is way faster than every other ray tracing engine. I am in contact with developers from Pixar and Maxon, and both engines will appear in this list once they are out of beta and publicly available.




Best Resources for 3D Artists

Edited 20th May 2020

Resources for Textures

Having a source for high-quality, high-resolution textures is necessary to create stunning photorealistic renderings (by the way, here is my article 10 Tips for more realistic renderings in Blender). They can also help you create things much faster, for example by using a manhole texture instead of modeling one.

Screenshot of Textures Haven Website

Texture Haven is 100% financed by donations, which is the reason there are no limitations. All textures are available in 8K, which is more than enough for almost every project, and for every texture there are AO, diffuse, displacement, normal, and roughness maps available. The only problem here is that there are only 151 textures available, but this will hopefully change soon.

Pricing: Free

Screenshot of Pixar One Twenty Eight Website

The most legendary 3D animation studio, Pixar, developed a texture pack themselves which they update frequently. All the textures are seamless thanks to a technique they patented in 1993. I think it can be very interesting for people who are doing stylized renderings and characters.

Pricing: Free

Screenshot of Texture Ninja Website

Texture Ninja is, with over 2000 textures, the biggest completely free texture library. The only downside is that image size and quality vary, and seamless textures as well as PBR textures are rare.

Pricing: Free

Screenshot of CC0 Textures Website

CC0 Textures is also completely free and no account is required. It offers over 700 seamless PBR textures.

Pricing: Free

Screenshot of 3dtextures.me Website

3dtextures.me also has a big variety of textures, all made with Substance Designer, which means all of them are seamless. But to download them in 4K you need to become a Patreon supporter; as a free user you can only download them in 1K, which may not be enough for some projects.

Pricing: 1K textures free, 4K textures starting at $5 per month

Screenshot of textures.com Website

textures.com has by far the biggest selection of textures and is used by the biggest studios around the globe. They offer not only textures but also 3D scans and objects. You need an account, which is completely free, but to download textures in bigger resolutions you need a premium account.

Pricing: Free for low resolutions, credit packs for high resolutions starting at $19

Screenshot of Poliigon Website

Poliigon is the most modern resource for textures and also offers the highest-quality assets, with a huge variety of high-resolution, seamless textures. It also offers its own software for easier import and for creating your own textures.

Pricing: Starting at $12

3D Objects

Especially as a professional artist, or as someone who doesn't do 3D primarily, saving time can be important. The best way to achieve this is to simply not do the work, so don't redo things that already exist. I was never able to understand why people take pictures of the Eiffel Tower when chances are high that much better photos already exist on the internet.

Screenshot of Three D Scans Website

ThreeDScans offers free high-resolution scans of statues, which are used by many 3D artists. Every 3D model is completely free and without any copyright restrictions. Don't worry if your browser says the site is unsafe; this is just because there is no SSL certificate on the site.

Pricing: Free

Casey Cameron, who also created HDRI Haven and Texture Haven, now has a new project where he uploads 3D models that are completely free to use. Here, too, there are no limitations or anything, since the project runs on donations. The only downside is that you can currently only find 39 models, but I guess this will change soon.

Pricing: Free

Screenshot of Sketchfab Website

Sketchfab started as a social media / portfolio platform which enabled people to upload their 3D models and share them with other people. But by now it is also a marketplace to sell 3D models.

Pricing: Free – $290

Screenshot of Blendswap Website

This one is only for Blender users. But it is awesome, since you can not only download 3D models but entire .blend files, which is something that can be really helpful for beginners who want to reverse-engineer stuff.

Pricing: Free

HDRIs

HDRIs, or spherical images, are a way to quickly add lighting to a scene or to add some extra detail to the lighting for more realism.

Screenshot of HDRI Haven Website

HDRI Haven is the HDRI counterpart to Texture Haven: it is also 100% financed by donations, so all HDRIs are completely free to use, without any limitations, and available in high resolutions.

Pricing: Free

Screenshot of HDR Maps Website

HDRMaps is the website with the most HDRIs. They are even used in third-party applications like Substance Painter, and they also offer a lot of freebies.

Pricing: Free and paid products

Pixar has also started to create some HDRIs for their RenderMan users, but of course you can use them with any software.

Pricing: Free

screenshot of hdri skies website

HDRI Skies is specialized in skies, which means they offer the biggest variety of sky HDRIs available.

Pricing: Low resolution free, high resolution $15




Export animations in Blender

Edited 11th December 2020

I think exporting a video out of Blender can be a bit confusing for beginners. It also took me a while to arrive at the workflow I am currently using, so perhaps not only beginners but also more advanced users can learn something from this.

I render in single frames and in a linear format, because I want to be able to pause the rendering process and stay as flexible as possible when it comes to post-processing.

Step 1. Bake your simulations

In Blender there are multiple types of simulations. Make sure that you bake every single one of them, which stores the simulation even if you close and reopen the project. This is not only a time-saver because you don't have to run the simulation again; a simulation also doesn't create the same result every time, which would mean that if you pause rendering and start it again, it would create a glitch in the simulation.
To achieve this, you just have to click on the Bake button in each cache menu.

Screenshot of Blender baking Cache
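Clicking Bake in every cache menu can also be done in one go from Python; a small sketch that bakes all point caches in the current scene:

```python
import bpy

# Bake every physics cache (cloth, particles, rigid bodies, ...) at once,
# so the simulation is stored and stays identical between render sessions
bpy.ops.ptcache.bake_all(bake=True)

# To discard the baked caches again later:
# bpy.ops.ptcache.free_bake_all()
```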

Step 2. Activate the Denoiser

In Blender version 2.81, Blender got its first AI denoiser, which delivers really remarkable results. You can read more about it in this article.

Comparison of a denoised by Intel Open Image Denoise and a image without denoiser

Since version 2.82 there is another AI denoiser, by Nvidia, which was developed for real-time ray tracing on Nvidia's RTX GPUs. But even though it is now possible to use it without an RTX GPU, it is much slower than using the IOID in the compositor, which means if you don't own an RTX GPU you should do it the way I describe here.

If you use a denoiser, you can use fewer samples because you don't have to worry about noise, which can lower your export times dramatically.

Screenshot of Blender Denoiser options

Step 3. Check everything

Check that everything you want to render is enabled in the Outliner, check that the resolution you set is correct, and make a test render to make sure you haven't forgotten anything.

Step 4. Set your output settings

Create a folder where you want your files to be exported to, because you will get a single image for every frame.
Then choose OpenEXR as the file format. This will not only give us the highest image quality; it is also a linear format without a baked-in color space, which gives you the most options when it comes to choosing the color space and post-processing without losing any quality.

Screenshot of Blender render Output settings
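The same output settings as a Python sketch (the output path is just an example):

```python
import bpy

render = bpy.context.scene.render

render.filepath = "//render/frame_######"          # folder plus frame-numbered file names
render.image_settings.file_format = 'OPEN_EXR'     # linear, high dynamic range
render.image_settings.color_depth = '32'           # full float precision
render.image_settings.exr_codec = 'ZIP'            # lossless compression
```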

Step 5. Start the rendering process

Before you start the rendering process, don't forget to save. Also, don't worry if one of the images looks a bit weird when you open it; that is caused by your OS not knowing what the format is and how to work with it.

Screenshot of Blender clicking on Render Animation

Step 6. Import the Images into DaVinci

DaVinci Resolve is the standard Hollywood color grading software. But it is also possible to edit with DaVinci, and the standard version is completely free and has no limitations on the basic functions. It is also possible to edit inside of Blender, but currently this is not recommendable.
To import the images, just create a new project and click on File > Import File > Import Media, then navigate to the folder where you exported the images and select all of them.
Then just drag the footage onto the timeline.

The images don't contain any information about the frame rate, which is the reason you have to set the frame rate manually if it is not 24. You can do that by right-clicking on the footage, clicking on Clip Attributes, and changing the frame rate.

Step 7. Change the Colorspace

A benefit of exporting everything in a linear color space is that you can switch the color space after the rendering process. A reason for this could be more realism from the Filmic color space or more color accuracy from the Standard color space.

To change the color space, you have to switch to the Fusion tab, which is something like the Compositor in Blender, but much more powerful and advanced.

Then add an OCIO Colorspace node via right-click > Add Tools > Color > OCIO Colorspace.
Hold down Shift and place the node between the two other nodes.

In the Inspector panel on the right side, you have the settings for each node.
For the OCIO Colorspace node, load the OCIO config file of Blender. Just click on Browse and navigate to the config.ocio file. You can find it in the installation folder under datafiles > colormanagement > config.ocio.

After this, set the Source Space to Linear and the Output Space to whichever color space you prefer. For me, this is Filmic sRGB or sRGB in most cases.

Step 8. Export

What you do next is up to you. You can post-process and color grade it, or just export it.



Comparison of a denoised by Intel Open Image Denoise and a image without denoiser

Blender 2.81 new denoiser (IOID)

Blender 2.81 new denoiser (IOID) a real Gamechanger

Published 24th October 2019

What is IOID?

IOID is an open-source denoiser for ray tracing developed by Intel and is currently one of the best, if not the best, on the market. Blender version 2.81 implemented this fantastic piece of software, which enables us to use much lower sampling rates to get a noise-free result without any artifacts, which results in much lower render times. If you want to learn more about IOID in general, just click here.

How to use the IOID

Download Blender 2.81

Since Blender 2.81 is not officially released yet, you have to download it from the Daily Builds section of blender.org.

The beta versions are stable most of the time, but don't forget to save your project first so that the autosave feature can start working.

Setting up the View Layer properties

Before you start rendering, you have to set up the settings in the View Layer properties correctly. For this, you have to tick the box for Denoising Data, and don't forget to uncheck the box for Denoise if you have opened an older project.

After this simply hit render.

Denoise the Image in the Compositor

To finally denoise the image, you have to open up a Compositor window and add a Denoise node.

Then connect the Noisy Image output of the Render Layers node with the Image input of the Denoise node, the Denoising Normal output with the Normal input of the Denoise node, and the Denoising Albedo output with the Albedo input of the Denoise node. Then simply connect the Image output of the Denoise node to the Image input of the Composite node, as shown in the image. In my case, I also added a Viewer node so that I can use the backdrop feature, and a reroute node because I like it when everything looks organized.
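The same node setup can also be built from Python. A sketch for Blender 2.8x, assuming the Denoising Data passes are enabled as described above (property names may differ slightly in later versions):

```python
import bpy

scene = bpy.context.scene
bpy.context.view_layer.cycles.denoising_store_passes = True  # the "Denoising Data" checkbox

scene.use_nodes = True
tree = scene.node_tree

render_layers = tree.nodes["Render Layers"]
composite = tree.nodes["Composite"]
denoise = tree.nodes.new("CompositorNodeDenoise")

tree.links.new(render_layers.outputs["Noisy Image"], denoise.inputs["Image"])
tree.links.new(render_layers.outputs["Denoising Normal"], denoise.inputs["Normal"])
tree.links.new(render_layers.outputs["Denoising Albedo"], denoise.inputs["Albedo"])
tree.links.new(denoise.outputs["Image"], composite.inputs["Image"])
```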

The Comparison

You can click or tap on it to open the image in a lightbox.



Tips for more realistic Renderings in Blender

10+1 Tips to get more realistic renderings in Blender.

Updated 16th December 2020

In this article, I want to give you some tips to get more realistic renderings and also explain why you should use them and what they actually do, to give you some deeper knowledge.

Everything here can be applied to CG in general, but it is written specifically for Blender.

1. Use HDRIs!

HDRIs, or spherical images as they are also called, are images that get wrapped around a 3D scene to light it. The benefit of using HDRI lighting is that it brings a lot of detail with it, which makes the render much more realistic, and on top of that it is pretty easy to set up.

I usually supplement the HDRI with emission planes or normal lights to shape the lighting the way I want it.

Since HDRI means "High Dynamic Range Image", images with a higher dynamic range are used for this, which is recommended because it makes a huge difference even if you don't export your images in a high dynamic range. You can recognize these images by their format: formats like Radiance HDR, OpenEXR, and TIFF can contain HDR images, while formats like JPEG or PNG cannot.

If you want to know where you can download HDRIs, I can recommend my article about the Best Resources for 3D Artists.

To everyone who wants to create studio lighting, I recommend the Blender Studio Light Addon. It is a free addon that allows you to create your own HDRIs within Blender by using images of real lamps, which, like normal HDRIs, also gives you the advantage of extra detail for more realistic renderings.
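Setting up an HDRI is just an Environment Texture node feeding the world's Background shader; a minimal Python sketch (the file path is a placeholder):

```python
import bpy

world = bpy.context.scene.world
world.use_nodes = True
nodes = world.node_tree.nodes
links = world.node_tree.links

env = nodes.new("ShaderNodeTexEnvironment")
env.image = bpy.data.images.load("/path/to/studio.hdr")   # placeholder path

background = nodes["Background"]
links.new(env.outputs["Color"], background.inputs["Color"])
background.inputs["Strength"].default_value = 1.0
```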

2. Bokeh

little girl on a hourse of a carousel 3d animated

By using the depth of field feature, which you can find in the camera settings, you can create bokeh, which not only makes your image more realistic, since it emulates something that happens in every camera.

It also gives you the possibility to hide things you don't want to show, like parts with fewer details or seams in the texture. Because let's face it, you can't create everything in high detail; it would eat up way too much time in some cases.

Another benefit of using depth of field is that it enables you to guide the viewer's attention, which makes the image more pleasing on a subconscious level.

3. Bevels

Blender 3D Screenshot creating Bevel on a Cylinder

If you look around you, no matter where you are, you will see that there aren't many objects with sharp edges around you. There are some exceptions, like blades or really thin objects, but these are just exceptions. And making something look realistic means making it look like it would in reality.

Even if you create more abstract motion graphics, this small light edge can enhance the final result.

Since Blender version 2.8 you can also create a bevel inside the shader by using the Bevel node. The Samples value corresponds to the resolution and the Radius to the size of the bevel. The benefit of this method is that it uses fewer computing resources, but when it comes to quality, a bevel made out of mesh still looks better.
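As a sketch, this is what the shader-based bevel looks like in Python: a Bevel node plugged into the Normal input of the Principled BSDF (radius and samples are example values):

```python
import bpy

mat = bpy.data.materials.new("RoundedEdges")
mat.use_nodes = True
nodes = mat.node_tree.nodes
links = mat.node_tree.links

bevel = nodes.new("ShaderNodeBevel")
bevel.samples = 4                              # "resolution" of the effect
bevel.inputs["Radius"].default_value = 0.005   # size of the rounded edge in meters

principled = nodes["Principled BSDF"]
links.new(bevel.outputs["Normal"], principled.inputs["Normal"])
```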

4. Motion Blur

Motion blur is something that is in every photo or video. Sometimes it is so subtle that it is invisible or only gets recognized subconsciously, but technically speaking it is there all the time, which is the reason I recommend using it if realism is your goal.

In Blender 2.8 you just have to tick the box in the Motion Blur menu, which you can find in the render settings tab.

The default options should be right most of the time, but I will still explain the settings to you.

Shutter

Motion blur is built up from two effects; the first one is called shutter. In real life it happens because the camera captures the light over a small period of time, usually half the length of a frame. This means if you are filming with 24 frames per second, the period of time in which the light hits the sensor is 1/48 of a second; with 25 frames it is 1/50, with 30 frames 1/60, and so on.

The longer the light can hit the sensor, the more motion blur you get, and vice versa. You can set the amount of time the light hits the sensor per frame with the Shutter option, which is set to 0.5 by default, meaning half the length of a frame, as mentioned above (24 FPS 1/48 s, 25 FPS 1/50 s, 30 FPS 1/60 s, ...). This is also the default setting of most cameras. You can change this to 1, for example, which would mean the light hits the sensor for the complete length of the frame (24 FPS 1/24 s, 25 FPS 1/25 s, 30 FPS 1/30 s, ...).

You can use this if you want to combine video and CGI and you used another shutter setting in your camera, but most of the time 0.5 will give you the most aesthetically pleasing results.

Rolling Shutter

The other effect is the rolling shutter. The reason this happens in real life is that the camera captures the image in lines from top to bottom. This happens quickly, but not all at the same time, which is the reason you can see it on fast-moving objects like cars or planes. It also happens with fast camera pans.

In Blender, you can activate it by selecting Top-Bottom for the rolling shutter, which is currently the only option. You can set the strength with the Rolling Shutter Duration, but here too the default settings will do their job most of the time.

But even if rolling shutter gives you more realism, you should ask yourself whether you want to use it, because it is less aesthetically appealing compared to the normal shutter effect.
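As a Python sketch, both effects live in the render and Cycles settings (the values shown are the defaults discussed above):

```python
import bpy

scene = bpy.context.scene

scene.render.use_motion_blur = True
scene.render.motion_blur_shutter = 0.5       # 0.5 = half a frame, e.g. 1/48 s at 24 fps

# Rolling shutter (Cycles-specific settings)
scene.cycles.rolling_shutter_type = 'TOP'    # scanlines are read from top to bottom
scene.cycles.rolling_shutter_duration = 0.1  # default strength
```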

Important

Don't forget that motion blur only exists if there is motion in the scene. If you want to shoot a still, you can add some simple movement, which makes a lot of sense if you are showing things that normally move, like planes, cars, or animals. But also in other scenes you can, for example, make the camera shake a bit to add that extra bit of realism.

5. Reference Images

Since we are trying to remake something that already exists, at least to a certain degree, it makes sense to orient ourselves on something. Even if you know how an object looks, it always makes sense to get reference images, because most of the time it looks a bit different than we remember.

Getting close to the real object can make a big difference, because often the basic shape is enough for our brain to recognize something and reconstruct the rest.

To organize your reference images, there is a helpful and free piece of software called PureRef. At this point, it can also be helpful to have a second screen.

6. Imperfections

Bare Metal with Scratches

In reality, nothing is perfect: every surface has unevenness, scratches, dirt, and so on, which is the reason you should use such textures all the time. You can use reference images for this, or just think about how something would look after it has been used: which surfaces get dirtier and which get less dirty, where there will be scratches, and where you will find dust after a while. What is important is that it makes sense, because then it looks real. And don't overdo it; that is a huge mistake beginners tend to make.

7. Remove your Noise

Image with alot of Noise

The type of noise that you see if you render with too low a sample count doesn't exist in real life or in any camera; it is a different type of noise, which is the reason you should remove it completely.

Since Blender version 2.83 we have the Denoising option in the Sampling panel of the render properties, which makes it easier than ever to denoise renderings. You can even denoise the viewport. You can choose between NLM, OptiX, and OpenImageDenoise. If you have an Nvidia RTX GPU, you should use the OptiX denoiser, because it is extremely fast. If you don't have an RTX GPU, you should use OpenImageDenoise; ideally, to get the best performance, you shouldn't use this option at all and instead use the Denoise node in the Compositor, as described in this article.

If you want to have film grain, I would recommend adding it in post, because as I already mentioned, it is a different kind of noise and looks different, especially in videos.

8. Enough Light Bounces

Screenshot of Blender 3D Software showing a problem wit Light Bounces

In reality, light bounces around a couple of thousand times. The problem is that this would eat up too much computing power, which is the reason you can set the number of light bounces in Blender. By default, the maximum bounces are set to 12, which is enough most of the time. But sometimes it makes a difference if you use a higher value.

In the example picture, you can see it clearly: the light only comes through the first three panes of glass. The reason is that I limited the maximum light bounces to 4. And even though it is most noticeable with glass, it can make a difference with any other material, especially in more complex scenes.

You can change the maximum number of light bounces in the Light Paths menu in the render settings.

Screenshot of Blender 3D Software Showing Lightpath Settings
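In Python these are the bounce properties of the Cycles scene settings; a small sketch with values along the lines discussed above:

```python
import bpy

cycles = bpy.context.scene.cycles

cycles.max_bounces = 12            # total bounces; 12 is the default
cycles.transmission_bounces = 12   # raise this if light "dies" inside stacked glass
cycles.transparent_max_bounces = 8
```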

9. Don't only use Principled Shaders

Even if the Principled Shader can be really useful and enables beginners to get more realistic materials by default, its possibilities are limited. And I have the feeling that since the Principled Shader got introduced, some people sometimes think it is enough.

So I want to encourage you to take the time and create nice and complex shaders. A good example of someone who does this is String Fairy.

10. Do post-processing!

Post-processing is something that not only can make your rendering look better; it can also make your images more realistic by emulating effects that cameras produce, like lens distortion, noise, vignetting, and so on.

11. Try other Render Engines

Cycles is good in most scenarios, but there are scenarios in which Cycles performs really badly. The best example is scenes with a lot of glass, water, or similar materials, because Cycles is bad at creating caustics (the bright spot on the right that you can see in the image above, which I rendered with LuxCore) compared to LuxCoreRender, which is probably the best Blender-compatible render engine for realism.

I summarized every render engine in this article.



Frame of Agent 327 Shortmovie

Why I still use Blender

Why I still use Blender for multimillion-dollar projects.

Rewritten 9th January 2021

The reason I am writing this article

I don't know if I am the only person doing this, but sometimes I wake up thinking: well, maybe Blender isn't the right thing for me anymore. Then I browse the web, looking at the features of Maya, Arnold, and others for hours. But in the end, I always come to the conclusion that Blender is still the right choice for me.

And I think I am not the only one who sometimes thinks like this. Not only professionals, but especially beginners who want to become better and think about maybe working as a 3D artist one day, can be insecure when it comes to this topic.

I wrote this article for people who are insecure about this topic, and for beginners who ask themselves whether Blender is really the right software to start with, and especially to continue with. So, I list the reasons why Blender is the best 3D software for me personally, even for "pros".

Blender is free

couple on motorcycle

That it is a free piece of software was the reason I used it back then, because as a student I did not really want to spend €3000, which was the price of C4D back then, or a similar amount for Maya, which had a comparable price tag. And it is also the reason many other people start using Blender instead of other 3D software.

But even now that it has become a job, it is a big reason I prefer Blender over the rest.

It is just much easier not having to log in or register every time you update it or install it on another PC, only for something to then not work, or, in a company, having to wait for an administrator who has to buy licenses, and for that he must make a request which needs to be signed by the COO, which takes two weeks.

And, on the other hand, I am not willing to give Maxon €60 per month if I can avoid it. But maybe that's just because I am from Swabia ;p

Cycles

Even if we as Blender users have some options by now, with LuxCore and Octane, and with Redshift and RenderMan coming within the next months (by the way, I have made a list with every render engine which is available for Blender here), Cycles is still my favorite one.

The reason is that it is so well integrated into Blender, better than Arnold in Maya or Redshift in C4D. It just does not feel like a separate piece of software. It is also seamlessly compatible with Eevee. And it has more possibilities for procedural shading than any other render engine I have used so far.

It also runs on every device (AMD GPU, Nvidia GPU, and CPU); Octane, for example, only runs on Nvidia GPUs.

Compared to Redshift it is a bit slower, but instead of investing €500 into a Redshift license, you could buy an RTX 30x0 GPU, which will probably improve your performance more.

System Settings Blender
You don’t need Redshift or Octane for good performance.

The Community

There is no 3D software out there with as much content and as many tutorials as Blender. The community is also way more wholesome. When it comes to Autodesk communities, it feels like everyone has a commercial afterthought and tries to sell something.

Here you can find an overview of every popular Blender community.

Most Add-ons

Blender has an add-on for everything. There are so many things you never thought you would need in 3D, and in most cases I later found out that there was an add-on that would have helped me get something done much faster.

“The most successful men work smart, not hard”

Bangambiki Habyarimana

The only problem is that the coolest add-ons, such as Massive for example, aren't compatible with Blender. But that's not relevant for me, since I have never worked in a big studio.

It is an all-in-One Solution

Blender sculpting
Blender Compositing
Blender Texture painting

With texture painting, very good sculpting tools, and some possibilities for post-processing, it is a wonderful all-in-one solution. Sure, for each category there is a better tool, even if that is very controversial when it comes to sculpting. But just having everything in one piece of software makes it easier and less time-consuming.

So much progress in development

Compared to other software like C4D, which publishes one major version every year, it feels like Blender publishes interesting new features almost every day, which keeps me interested in its further development.

Unique features

Blender has many features that are only available in Blender, which can be really important in certain situations.

action Figure with parachute

For example, last year I had a project for which we motion tracked a dancing musician, and I had to modify his motion. So I used the Simplify F-Curves feature in the Graph Editor to make his movements editable.



rocket flying infront of blue sky


8 Tips - How to render faster in Blender

Edited 18th December 2020

In this article I show you 8 tips on how to render faster in Blender.
If you think I forgot something, write me a message. I am always happy to learn something new.

01. Use GPU + CPU

Since Blender version 2.8 you can use your GPU and CPU simultaneously, which can give you a huge advantage, especially if you own a good CPU like a Threadripper or a Xeon.

If you have an Nvidia RTX GPU, you can also switch from CUDA to OptiX, which will increase your render performance by another 20%-30%.
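A sketch of how this can be configured from Python (the preferences API has changed slightly between versions, so treat this as an outline):

```python
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = 'OPTIX'   # or 'CUDA'; OptiX requires an RTX GPU

# Refresh and enable every detected device, GPU and CPU alike
prefs.get_devices()
for device in prefs.devices:
    device.use = True

bpy.context.scene.cycles.device = 'GPU'
```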

02. Denoiser

Since Blender version 2.82 we now have three denoisers.
The default one from Blender itself, which is also the oldest one, since it has existed since version 2.79.
Then there is the IOID denoiser from Intel, which works with artificial intelligence and delivers remarkable results, as you can see in my article "Blender 2.81 new denoiser (IOID) a real Gamechanger".

And then there is also the OptiX denoiser from Nvidia, which only works with their RTX GPUs. It is the same denoiser that is used by every game with ray tracing, and it enables denoising right inside the viewport.

Using a denoiser allows you to use fewer samples while achieving the same result.

03. Keep it simple

Try to use as few polygons as possible and delete unnecessary objects in the scene or hide them.

A new feature of Blender 2.8 that can help is the Cycles Bevel node, which enables you to create bevels based on a normal map instead of real geometry. The problem here is that it only works for very small bevels; on bigger bevels it doesn't look good.

04. Light Paths

The light paths simply set how often the light bounces from one object to another. Technically, more light bounces mean more realism, since in reality the number of bounces is close to infinity; especially with glass or in very complex scenes this matters. But most of the time there is no real difference in the end result between many and just a few light bounces. For a simple scene, a total of 5 light bounces should make your scene render faster without changing the result.

But here it is the same as with the denoiser: it's all about experimenting and getting a feeling for how it affects your rendering.

05. Principled Shader

The Principled Shader not only brings more realism by default, it also brings better performance than a recreation of the same shader from individual nodes. In Blender 2.8 we got Principled Hair and Principled Volume alongside the Principled BSDF, so use them, they are awesome. What matters is the final result, not the effort you put in by creating everything yourself.

06: Use the latest Blender Version.

Roughly every 3 months Blender publishes a new major version, which not only means new features (which can also improve performance), it also comes with a lot of performance improvements.

If you want to stay up to date with the latest version of Blender, you can install it over Steam. For the non-gamers among you: Steam is the biggest online store for video games, but it also has other software like Blender in its range.

release plan for Blender

07: Use another render Engine

Cycles is not the only render engine out there. It is also not the only render engine for Blender, or the only free render engine for Blender.
In some cases, it can make sense to use another one to get better performance or better results. I wrote a whole article about this topic where I listed every render engine for Blender.

glass brick with caustics

08: Adaptive Sampling

Since Blender version 2.83 there is the Adaptive Sampling feature, which allows Blender to use different sampling rates within the same image.

That means noisier parts of the image that require more samples get more samples, but Blender doesn't waste time on parts of the image that don't require as many. This can make a render several times faster, or not faster at all; it heavily depends on the image.

Just check the box for Adaptive Sampling and you are done. You could also set a value for the minimum samples or play with the noise threshold, but I just leave them as they are.
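For completeness, the same checkbox via Python (2.83+; threshold and minimum samples are left at the automatic defaults):

```python
import bpy

cycles = bpy.context.scene.cycles

cycles.use_adaptive_sampling = True
cycles.adaptive_threshold = 0.0    # 0 = let Blender pick the noise threshold automatically
cycles.adaptive_min_samples = 0    # 0 = automatic
```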
