Mental Model for Texture Mapping Node Type "Point" ("Industrial Environments with Eevee" Course)

To rotate the fog, whose density follows a "Linear Gradient" pattern based on "Object" coordinates:

The "Coordinate System" gizmo at the bottom center of the fog wall above shows the orientation of the "Local Coordinate System" of the "Fog Object". The fog is to be transformed so that a fog layer floats directly under the garage's ceiling while the space below, down to the floor, stays clear:

This means that the fog must be rotated +90 degrees around the "Fog Object's Local Y Axis" (counterclockwise, with the "Local Y Axis" pointing towards us). Since the "Mapping Node" is set to type "Point", we have to enter the opposite value of -90 degrees. The "Fog Object's Local Coordinate System" has to be rotated mentally in the same direction as the texture, so that the "Local X Axis" now points downwards relative to the camera's orientation in the screenshots above. For the fog layer to be lifted towards "Negative X" of the new "Local X Axis", we have to change the "Local X Axis" value in the opposite direction, towards "Positive X".
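The "opposite direction" rule can be checked with a toy calculation outside Blender. This is a minimal sketch (not Blender's actual shader code): it assumes the "Linear Gradient" density simply grows along +X, and models a "Point" Mapping node as transforming the *coordinates* before the texture is sampled. Entering -90 degrees then makes the visible pattern appear rotated by +90 degrees:

```python
import math

def rotate_y(p, degrees):
    """Rotate a point (x, y, z) around the Y axis by the given angle."""
    t = math.radians(degrees)
    x, y, z = p
    return (x * math.cos(t) + z * math.sin(t),
            y,
            -x * math.sin(t) + z * math.cos(t))

def linear_gradient_x(p):
    """Toy 'Linear Gradient' texture: density grows along +X."""
    return p[0]

def fog_density(p, rot_degrees=-90.0):
    # A "Point" Mapping node transforms the coordinates first,
    # then the texture is sampled at the transformed position.
    return linear_gradient_x(rotate_y(p, rot_degrees))

# With -90 degrees entered, the density that used to grow along +X
# now grows along -Z: the pattern has visually rotated by +90 degrees.
sample_on_x = fog_density((1.0, 0.0, 0.0))   # close to 0.0 now
sample_on_neg_z = fog_density((0.0, 0.0, -1.0))  # close to 1.0 now
```

So the settings in the node are the inverse of the motion you want to see in the pattern, which is exactly why the mental model above enters -90 for a visual +90 rotation.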

  • The final fog looks like this:

    I've added a slight "Emission Strength" of 0.0025 using the green color from the "Area Lights". This alone would already create a kind of fog in the background:

  • I forgot to mention that the "Transformation Order" for a "Mapping Node" set to type "Point" is:

    Scale => Rotate => Translate 

    (see here)

    That's why we need to mentally rotate the "Local Coordinate System" of the "Fog Object" in order to find the right settings for the final translation that raises the fog layer.