In this paper, we propose a global gaze following method for panoramic images based on a patch-based multi-task multi-scale reborn network (MMRGaze360). Unlike existing approaches that rely on spherical networks or process only local regions, our architecture explicitly accounts for the distortions introduced by the sphere-to-plane projection, enabling gaze following across full 360-degree images. MMRGaze360 incorporates field-of-view (360-FoV) and sight-line (360-Gaze) generators to model gaze behaviour and scene information in 360-degree images. A multi-task multi-scale module captures features from multiple patches centred on points estimated along the 360-Gaze, using multi-scale attention maps. These features, together with the 360-FoV, are fused to produce the final heatmap. In addition, we employ multi-layer perceptrons and convolutional networks with a reborn mechanism to enhance information usage and feature representation. We also introduce a new dataset, SRGaze360, which covers a wider range of sphere-to-plane distortion conditions. Experimental results on the GazeFollow360 and SRGaze360 datasets demonstrate that our method outperforms previous work and confirm that it overcomes the limitations of 2D gaze following in handling out-of-frame gaze targets and distortions in 360-degree images.