• With the release of FS2020 we see an explosion of activity on the forum, and of course we are very happy to see this. But having all questions about FS2020 in one forum becomes a bit messy, so we would like to ask you all to use the following guidelines when posting your questions:

    • Tag FS2020 specific questions with the MSFS2020 tag.
    • Questions about making 3D assets can be posted in the 3D asset design forum. Either post them in the subforum of the modelling tool you use or in the general forum if they are general.
    • Questions about aircraft design can be posted in the Aircraft design forum.
    • Questions about airport design can be posted in the FS2020 airport design forum. Once airport development tools have been updated for FS2020, you can of course post tool-specific questions in the subforums of those tools as well.
    • Questions about terrain design can be posted in the FS2020 terrain design forum.
    • Questions about SimConnect can be posted in the SimConnect forum.

    Any other question that is not specific to an aspect of development or tool can be posted in the General chat forum.

    By following these guidelines we make sure that the forums remain easy to read for everybody and also that the right people can find your post to answer it.

Blender Normal map lighting issues

The compiler or DirectX has no way of knowing your intent, such as whether the map was already flipped beforehand. Why does anyone do anything the way they do in computers? Well, people don't always think things through. There are a million things I'd change about how the SDK works, for instance, but we have to live with the weird way some programmers' brains work. Sometimes they are just young and inexperienced; some are on the autistic spectrum and not good at multitrack thinking, so they get too focused on a single item. Companies sometimes intentionally change things just to make them more proprietary, but that is a story for another forum, or at least a different thread.

I don't think anyone is mixing up rotating the UV with inverting the green channel of a normal map. The 2D UV space doesn't have a z-axis, but I usually hit R, Z beforehand, as it seems to avoid a Blender issue where it sometimes miswraps the texture when I use the rotate function by itself in the UV editor (the same happens with R, X or R, Y). It is probably a hotkey mode issue I sometimes get locked into, and I could probably hit any key to reset it (but Z makes sense and is easy to remember, since hitting the key nullifies the rotational mode); that's just what I know works every time for me in Blender's UV editor.
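For anyone following along, the reason rotating a UV/texture and flipping a channel are genuinely different operations is that a tangent-space normal map stores vectors, not just colors: if you rotate the pixel grid, the per-pixel X/Y components must be rotated too, or the lighting will be wrong even when the albedo lines up. A minimal sketch with numpy (hypothetical helper; assumes float components in [-1, 1] and a counter-clockwise rotation — the exact sign convention depends on the map's handedness):

```python
import numpy as np

def rotate_normal_map_90ccw(nm: np.ndarray) -> np.ndarray:
    """Rotate an HxWx3 float normal map (components in [-1, 1]) by 90
    degrees counter-clockwise, rotating both the pixel grid and the
    per-pixel XY vectors. Rotating only the pixels (np.rot90 alone)
    would leave every encoded vector pointing the old way."""
    rotated = np.rot90(nm)          # move the pixels
    out = rotated.copy()
    # rotate each (x, y) vector by 90 degrees CCW: (x, y) -> (-y, x)
    out[..., 0] = -rotated[..., 1]
    out[..., 1] = rotated[..., 0]
    return out

# A normal leaning toward +X should lean toward +Y after a CCW rotation.
nm = np.zeros((1, 1, 3))
nm[0, 0] = [1.0, 0.0, 0.0]
print(rotate_normal_map_90ccw(nm)[0, 0])
```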

Actually, some seamless textures are virtually identical under rotation, and it's hard to tell if something is slightly off-rotation. I've done it many times: I'd spend over 15 minutes trying to align a texture before realizing it was backwards. It depends on the texture; unique or completely non-symmetrical textures are obvious when the orientation is off, but with some textures you can barely tell at all, even though it will still change the shading. So I stand by my original points...
 

Pyscen

Resource contributor
OK, I'm seriously confused now. First, some people here seem to mix up flipping or rotating the normals of the mesh faces in Blender with flipping or rotating the normal map texture. Second, some people claim that you need to invert the Y-axis of the normal map, while others talk about inverting the UV map. These would have totally different effects. If I flip, rotate or otherwise manipulate the UV map, this changes how my whole texture is applied: it will be upside down, sideways, whatever. If I just flip the normal map in Photoshop or GIMP, only the normal map will be different, while the albedo and metallic textures keep their orientation.

I remember that in Prepar3D you also had to flip the Y-axis of your whole texture (which had the same effect as flipping the UV map would have in Blender) before exporting; ModelConverterX even had an option for this in the material editor. I just wonder why the exporter tool does not automatically take care of this. If DirectX interprets the Y orientation differently, why not automatically flip the Y-axis when converting from PNG to DDS? Furthermore, are there any modifications the MSFS toolkit applies to the textures when you export your Blender model to glTF?
The only thing I was speaking about is this:

If you are using the stand-alone version of the NVIDIA Texture Tools (or any other OpenGL-based application to convert normal maps), you will need to flip the Y-axis on your normal maps to be compatible with MSFS. This is true for all flight sims, including FSX (and FSX: SE) and P3Dv1 through P3Dv5. Why? Because the difference between OpenGL and DirectX is the Y-axis. There aren't any other changes needed or required of the designer when using the SDK for MSFS or FSX, that I'm aware of.

MCX uses DirectX when creating the normal maps, therefore you do not need to do anything with the Y-axis at all. That said, you may still need to flip the X-axis, which will vary depending on your albedo/diffuse maps.
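For reference, the OpenGL-to-DirectX conversion described above amounts to inverting the green (Y) channel of the normal map. A minimal sketch in Python with numpy (hypothetical function name; 8-bit RGB maps assumed):

```python
import numpy as np

def opengl_to_directx(normal_map: np.ndarray) -> np.ndarray:
    """Convert an 8-bit tangent-space normal map between OpenGL and
    DirectX conventions by inverting the green (Y) channel.
    The operation is its own inverse, so it converts either direction."""
    out = normal_map.copy()
    out[..., 1] = 255 - out[..., 1]
    return out

# A pixel whose Y component points fully "up" in OpenGL terms (G = 255)
# points fully "down" in DirectX terms (G = 0) after the flip.
pixel = np.array([[[128, 255, 255]]], dtype=np.uint8)
print(opengl_to_directx(pixel)[0, 0])  # G channel becomes 0
```

Applying the function twice returns the original map, which is why trying the flip "either way", as suggested later in the thread, is a cheap experiment.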
 
I've got over 100 objects at my airport exported from Blender now (so I think I can safely say I've done quite a bit of this and am speaking from experience, even if I didn't explain everything exactly as some wanted), but there are definitely a lot more issues than just the Y issue; that one is the easiest to get around and the least confusing. The others are much harder to diagnose, especially since you have to check the DDS output for issues.

There is a file-lock bug that sometimes locks the DDS in a cache and will actually output one object's normal map over another's. Luckily this is an extremely rare bug; you'll notice it when it happens by a line appearing in your texture. There are just too many issues to list. Most of these issues did not exist in X-Plane, though there were a few there. There are also color-mapping conversion issues with some normals, so be careful with normals in general and make sure the result looks as expected.

There are other issues whose cause I cannot explain, so I can only offer a best guess.
 
I'm sure you know what you're doing, much more than I do; it's just that the more I read about this stuff, the more confused I am about how to correctly apply and convert my materials and how to correctly export my models into the sim.
 
I really don't consider myself an expert at all; I am just fumbling around like most. My only point was that I've seen a LOT of weird issues, because I have exported a lot of objects, some of which I never could explain.

The same "flip the Y" solution is repeated over and over again in here, but many times the issues are more complex and relate to the way the lighting and the comp textures interact. Besides, you can always try the normal flipped either way, and the way the normal itself looks matters as much, or in some cases more. If the normal is flat, flipping it will have almost no effect anyway. When you're having problems, I suggest getting rid of the normal, reverting the Blender save, deleting the glTF and XML files, and starting the process over from the beginning of the workflow, i.e. go back to Materialize or whatever your workflow uses and redo it.
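The point about flat normals can be checked with a little arithmetic: a flat pixel encodes the vector (0, 0, 1), stored as the 8-bit color (128, 128, 255), and the Y flip barely moves it. A quick sketch in Python:

```python
# A perfectly flat tangent-space normal is (0, 0, 1), stored with the
# green channel at 128 in 8-bit. Inverting gives 255 - 128 = 127,
# which decodes to a Y component of about -0.004 -- visually
# indistinguishable from flat, so the flip is effectively a no-op
# on flat areas of the map.
g = 128
flipped = 255 - g
y = flipped / 255 * 2 - 1   # decode 8-bit value back to [-1, 1]
print(round(y, 3))  # -0.004
```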
 
Maybe the best way is to keep it simple: printed card stock, haha! When I checked the normals (in Blender) on one of my walls, it was so dark on the bottom; I found the lower half of the wall had the normal arrows/indicators pointing to the back of the face, while others pointed forward, and mid-wall and at the top they again pointed the wrong way. I built this wall as a flattened cube and removed faces from the back, top and bottom. I'm thinking that deleting them separately, or in the wrong order, caused the issues. It's kind of like applying modifiers in the correct order, or when you shade an object smooth and it gets a gradation look with the thickness getting thinner at corners, which you have to correct in the settings of the modifier. So my thought is: if you build a plane and just extrude its edges (or duplicate vertex groups in edit mode and extend them) to make more walls, perhaps that will keep everything facing the right direction. The more I dive into Blender, the more complex I find it! Kudos to the devs of Blender and Blender2MSFS; that's a ton of work. And not to forget the "experts" in these programs who share their findings and experience, we would be dead in the water without them!
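The backwards-facing faces described above come from vertex winding order: Blender derives a face's normal from the order its vertices are listed (right-hand rule), so deleting and rebuilding faces by hand can easily leave the winding, and therefore the normals, inconsistent. A minimal illustration in plain Python (hypothetical triangle; in Blender itself, Mesh ▸ Normals ▸ Recalculate Outside, Shift+N in edit mode, fixes inconsistent faces in one step):

```python
def face_normal(a, b, c):
    """Unnormalized normal of triangle (a, b, c) via the cross product
    of its two edge vectors. Which side it points to depends entirely
    on the order the vertices are given in."""
    u = [b[i] - a[i] for i in range(3)]
    v = [c[i] - a[i] for i in range(3)]
    return [u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0]]

tri = ([0, 0, 0], [1, 0, 0], [0, 1, 0])
print(face_normal(*tri))                     # counter-clockwise -> +Z
print(face_normal(tri[0], tri[2], tri[1]))   # reversed winding -> -Z
```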
 