- Estimating the relative pose and motion of cooperative satellites using on-board sensors is a challenging problem. When the satellites are noncooperative, the problem becomes even more complicated, as there may be little a priori information about the motion and structure of the target satellite. In this paper, this problem is solved using only visual sensors, whose measurements are processed through robust filtering algorithms. Using two cameras mounted on a chaser satellite, the relative state with respect to a target satellite, including the position, attitude, and rotational and translational velocities, is estimated. The new approach employs a stereoscopic vision system to track a set of feature points on the target spacecraft. The perspective projection of these points onto the two cameras constitutes the observation model of an iterated extended Kalman filter (IEKF) estimation scheme. Using new theoretical results, the information contained in the visual data is quantified via the Fisher information matrix. It is shown that, even in the noncooperative case, information pertaining to the relative attitude and target structure can be extracted. Finally, a method is proposed for rendering the relative motion filtering algorithm robust to uncertainties in the target's inertia tensor. This is accomplished by endowing the IEKF with a maximum a posteriori identification scheme that determines the most probable inertia tensor from several available hypotheses. The performance of the new filtering algorithm is validated through Monte Carlo simulations, and a preliminary experimental evaluation is also provided.
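The maximum a posteriori identification step mentioned above can be sketched as a generic multiple-model Bayesian update: each candidate inertia tensor drives its own IEKF, and at every measurement epoch the Gaussian likelihood of each filter's innovation is combined with the running hypothesis prior via Bayes' rule. The function and variable names below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def map_hypothesis_update(priors, residuals, innov_covs):
    """Update posterior probabilities of inertia-tensor hypotheses.

    priors      : array of current hypothesis probabilities
    residuals   : list of innovation vectors, one per hypothesis filter
    innov_covs  : list of innovation covariance matrices, one per filter

    Returns the normalized posteriors and the index of the MAP hypothesis.
    """
    posteriors = np.empty_like(priors)
    for j, (nu, S) in enumerate(zip(residuals, innov_covs)):
        k = len(nu)
        # Gaussian innovation likelihood N(nu; 0, S)
        lik = np.exp(-0.5 * nu @ np.linalg.solve(S, nu)) / \
              np.sqrt((2 * np.pi) ** k * np.linalg.det(S))
        posteriors[j] = priors[j] * lik
    posteriors /= posteriors.sum()  # normalize (Bayes' rule)
    return posteriors, int(np.argmax(posteriors))

# Illustrative usage: hypothesis 0 yields a small innovation (good model),
# hypothesis 1 a large one, so the posterior mass shifts toward hypothesis 0.
priors = np.array([0.5, 0.5])
residuals = [np.array([0.1, -0.05]), np.array([2.0, 1.5])]
innov_covs = [np.eye(2), np.eye(2)]
posteriors, best = map_hypothesis_update(priors, residuals, innov_covs)
```

Repeating this update over successive measurements concentrates probability on the hypothesis whose filter best predicts the observations, which is the intuition behind selecting the most probable inertia tensor from the available candidates.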