360° Environment different in Rendered and Raytraced Display mode

I have created a custom Environment with a spherical projection, and I needed to rotate the background image to align it with the scene by tweaking the offset U value. It shows correctly in the Rendered display mode, but in the Raytraced display mode and in the Rhino render it looks as if the offset values were still at their defaults.
Any ideas?

No one can troubleshoot or work on the issue without a repeatable example.

Please provide all the files and steps to follow to repeat the problem.


Hello John,
Apologies. I assumed it would be easy to replicate from the description in the original post.
1. Create a new model.
2. Create a new Basic Environment.
3. Add a background image.
4. Set the projection to Spherical.
5. Click on the image name and change anything in the Mapping settings - for example, make the image repeat 20 times.
6. Go to the rendering menu and set the backdrop to 360° Environment, with the environment you just created.
7. Set the view's display mode to Rendered; you should see the image applied to the environment sphere 20 times.
(screenshot: Rendered example)

Set the view's display mode to Raytraced; you should see the image applied to the environment sphere only a single time.
(screenshot: Raytraced example)
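For reference, the tiling behaviour that Rendered shows can be thought of as an ordinary direction-to-UV lookup with the texture's repeat and offset applied afterwards. Below is a minimal sketch of that idea in Python; it is not Rhino's actual shader code, and the function name and `repeat_u`/`offset_u` parameters are my own labels for the Mapping settings described above:

```python
import math

def spherical_uv(direction, repeat_u=1.0, repeat_v=1.0, offset_u=0.0, offset_v=0.0):
    """Map a unit direction vector to equirectangular UVs, then apply
    texture repeat and offset the way a tiled 2D texture would."""
    x, y, z = direction
    u = 0.5 + math.atan2(y, x) / (2.0 * math.pi)           # longitude -> [0, 1)
    v = 0.5 + math.asin(max(-1.0, min(1.0, z))) / math.pi  # latitude  -> [0, 1]
    # Repeat/offset, wrapping back into [0, 1) like a repeating texture.
    u = (u * repeat_u + offset_u) % 1.0
    v = (v * repeat_v + offset_v) % 1.0
    return u, v

# With repeat_u=20 the same image column recurs 20 times around the sphere;
# with repeat_u=1 (the default) the lookup ignores the tiling entirely,
# which matches what Raytraced shows in the repro above.
print(spherical_uv((1.0, 0.0, 0.0), repeat_u=20.0))
```

The report boils down to Raytraced behaving as if the last two lines of the lookup (the repeat/offset step) were skipped.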

I don’t know how one would even do a spherical projection of a regular image in Blender Cycles. This type of projection isn’t supported at the moment.

Hi @nathanletwory ,
Pardon my ignorance.
I would expect the background on the environment sphere to work pretty much like a texture on a sphere. There I can tile and mirror the texture, and it renders the same way in the Rendered and Raytraced display modes. It did the trick for what I needed now, but I would like to understand the issue.


At present there is no shader in Raytraced that can do what Rendered does for environments.