In this paper, the problem of self-calibration for large astronomical arrays such as the Dutch Low Frequency Array (LOFAR) is considered. We assume direction-dependent gain and phase errors that need to be estimated and calibrated out. Combining the subspace-fitting and least-squares approaches, the signal subspace of a single short-term-interval (STI) sample of the LOFAR received data is used to build a cost function whose minimizer is a statistically efficient estimator of the unknown parameters, namely the gains and phases of the telescopes. Subsequently, an iterative algorithm for finding the minimum of the cost function is presented, in which the unknown calibration parameters of the core stations and those of the external subarray are separated. As a result, the computational complexity of the proposed method is significantly reduced compared with existing methods based on direct covariance fitting. Finally, the performance of the proposed method is compared with that of the conventional peeling method in computer simulations. An example of calibrating the core of the LOFAR array on Cyg A is also provided.