Unity raycast with different behavior between editor and build

I am trying to build a curved mesh in Unity. The idea is to put all of the GameObjects on a sphere and show the sphere from the inside.

I got this working without any problems using a render texture (1500×1500) generated by a camera filming a canvas. It works as expected.
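Roughly, the setup looks like this (a simplified sketch, not my exact code; the component and field names are illustrative):

using UnityEngine;

public class CurvedScreenSetup : MonoBehaviour
{
    [SerializeField] private Camera uiCamera;         // camera filming the flat canvas
    [SerializeField] private Renderer sphereRenderer; // sphere shown from the inside

    private void Awake()
    {
        // 1500x1500 render texture, as described above
        var renderTexture = new RenderTexture(1500, 1500, 24);
        uiCamera.targetTexture = renderTexture;              // the camera renders the canvas into the texture
        sphereRenderer.material.mainTexture = renderTexture; // the sphere displays that texture
    }
}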

However, as you can guess, UI events do not work, since the canvas used to create the render texture is the one handling the events, not the texture itself.

To fix this, I created a script that translates the coordinates on the sphere into coordinates on the canvas and sends the event to the right GameObject using a raycaster:

eventSystem.RaycastAll(computedEvent, raycastList);
var raycast = FindFirstRaycast(raycastList);

This code lets me propagate the event to the right GameObject on the canvas, and it works perfectly.
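For completeness, the forwarding logic is roughly equivalent to this sketch (the sphere-to-canvas conversion is omitted, and the class and method names are illustrative, not my actual code):

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;

public class CurvedScreenInput : MonoBehaviour
{
    [SerializeField] private EventSystem eventSystem;
    private readonly List<RaycastResult> raycastList = new List<RaycastResult>();

    // canvasPosition is the point on the flat canvas computed from the hit point on the sphere
    private void ForwardClick(Vector2 canvasPosition)
    {
        var computedEvent = new PointerEventData(eventSystem) { position = canvasPosition };

        raycastList.Clear();
        eventSystem.RaycastAll(computedEvent, raycastList);

        // Same idea as BaseInputModule.FindFirstRaycast: take the first result with a valid GameObject
        foreach (var result in raycastList)
        {
            if (result.gameObject != null)
            {
                ExecuteEvents.Execute(result.gameObject, computedEvent, ExecuteEvents.pointerClickHandler);
                break;
            }
        }
    }
}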

Everything works exactly as I want when I run the game in the editor. But, and I can't understand why, when I build the project and run it, it doesn't work anymore.

I made a few comparisons between the built game and the game in the editor (see the logging sketch after this list):

  • The positions of all the components are exactly the same
  • The computed positions on the canvas are the same, regardless of the resolution
  • In the built game, all the components with a computed Y value higher than 1080 are working fine
  • In the built game, all the components with a computed Y value lower than 1080 are not working
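For reference, the values I compared can be logged with something along these lines (a hypothetical helper; the field name is illustrative):

using UnityEngine;

public class ResolutionProbe : MonoBehaviour
{
    [SerializeField] private Canvas targetCanvas; // the canvas that is rendered into the render texture

    private void Start()
    {
        // Compare these values between the editor and the build
        Debug.Log($"Screen: {Screen.width}x{Screen.height}");
        Debug.Log($"Canvas pixelRect: {targetCanvas.pixelRect}");
        Debug.Log($"Canvas scaleFactor: {targetCanvas.scaleFactor}");
    }
}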

When I say "not working", I mean that this function

eventSystem.RaycastAll(computedEvent, raycastList);

returns an empty list. The raycast can't find any GameObject in the build, while it can in the editor.

My first guess was that Unity culls elements it considers useless on screen as an optimization, but that doesn't seem to be the case.

My question is: why is the behavior different? Did I miss something? I suppose the 1080 value is significant, but I can't find what it relates to.