Robotic guides take visitors on a tour of a facility. Such robots must always know the position of the visitor for decision-making. Current tracking algorithms largely assume that the person is nearly always visible. In the robotic guide application, a person's visibility is often lost for prolonged periods, especially when the robot is circumventing a corner or making a sharp turn. In such cases, the person cannot quickly re-enter the limited field of view of the rear camera. We propose a new algorithm that can track people for prolonged periods under such conditions. The algorithm benefits from an application-level heuristic that the person nearly always follows the robot, which can be used to predict the person's motion. The proposed work uses a Particle Filter with a 'follow-the-robot' motion model for tracking. The tracking is performed in 3D using a monocular camera. Unlike approaches in the literature, the proposed work observes from a moving base, which is especially challenging: a rotation of the robot can cause a large, sudden change in the position of the human in the image plane, which approaches in the literature would filter out as noise. Tracking in 3D can resolve such errors. The proposed approach is tested in three different indoor scenarios. The results show that the approach is significantly better than the baselines, including tracking in the image plane and projecting to 3D, tracking using a randomized (non-social) motion model, tracking using a Kalman Filter, and using an LSTM for trajectory prediction.
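The core idea of a Particle Filter with a 'follow-the-robot' motion model can be sketched as below. This is a minimal illustration, not the paper's implementation: the particle count, the assumed 1 m following distance, the gain, and the noise levels are all hypothetical parameters, and the Gaussian observation likelihood stands in for whatever monocular 3D measurement model is actually used. Occlusion is simulated by simply skipping the measurement update, so particles are propagated by the social motion model alone.

```python
import numpy as np

rng = np.random.default_rng(0)

def follow_motion_model(particles, robot_pos, dt, follow_gain=0.8, noise_std=0.05):
    """Propagate particles under the 'follow-the-robot' heuristic:
    each particle drifts toward a point ~1 m behind the robot
    (hypothetical following distance), plus Gaussian process noise."""
    to_robot = robot_pos - particles
    dist = np.linalg.norm(to_robot, axis=1, keepdims=True) + 1e-9
    desired = to_robot / dist * np.maximum(dist - 1.0, 0.0)
    return particles + follow_gain * desired * dt + rng.normal(0.0, noise_std, particles.shape)

def update_weights(particles, observation, obs_std=0.2):
    """Weight particles by a Gaussian likelihood of a 3D observation."""
    d2 = np.sum((particles - observation) ** 2, axis=1)
    w = np.exp(-0.5 * d2 / obs_std**2)
    return w / (w.sum() + 1e-12)

def resample(particles, weights):
    """Multinomial resampling proportional to the weights."""
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]

# Minimal loop: the robot moves forward; the person follows ~1 m behind.
n = 500
particles = rng.normal([0.0, -1.0, 0.0], 0.3, size=(n, 3))
for t in range(20):
    robot = np.array([0.1 * t, 0.0, 0.0])
    particles = follow_motion_model(particles, robot, dt=1.0)
    if t % 3 != 0:  # simulate intermittent occlusion: no update every 3rd step
        obs = robot + np.array([-1.0, 0.0, 0.0])  # person seen 1 m behind robot
        weights = update_weights(particles, obs)
        particles = resample(particles, weights)

estimate = particles.mean(axis=0)  # tracked 3D position of the person
```

During the occluded steps, the estimate keeps moving with the robot because the motion model alone predicts the person's trajectory, which is exactly what lets the tracker survive prolonged visibility loss.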