A Monte Carlo approach is presented for the problem of least-squares fitting of points with correlated errors. The method consists of defining an uncertainty space based on the absolute error of each data point and randomly sampling within it. This procedure is repeated a large number of times (typically 1000), and a regression is obtained for each set of data. The results are stored and described statistically at the end of the process. Comparisons were made with results obtained by conventional methods applied to Rb/Sr isochrons, without any significant discrepancies. The main advantages of the proposed method are its simplicity and the possibility of graphically representing the variability.
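The procedure described above can be illustrated with a minimal sketch. This is not the paper's implementation: it assumes Gaussian sampling within each point's uncertainty and an ordinary (unweighted) least-squares fit per trial, whereas the paper samples an uncertainty space defined by absolute errors; the synthetic data below are hypothetical.

```python
import numpy as np

def monte_carlo_fit(x, y, sx, sy, n_trials=1000, seed=0):
    """Perturb each data point within its error, refit a line each
    time, and describe the collected fits statistically."""
    rng = np.random.default_rng(seed)
    slopes = np.empty(n_trials)
    intercepts = np.empty(n_trials)
    for i in range(n_trials):
        # One random realization of the data set (Gaussian errors
        # assumed here for illustration).
        xi = rng.normal(x, sx)
        yi = rng.normal(y, sy)
        slopes[i], intercepts[i] = np.polyfit(xi, yi, 1)
    # Statistical description of the stored regressions.
    return {
        "slope": slopes.mean(), "slope_sd": slopes.std(ddof=1),
        "intercept": intercepts.mean(), "intercept_sd": intercepts.std(ddof=1),
    }

# Hypothetical isochron-like data lying on y = 0.5 x + 0.7,
# with equal absolute errors on both coordinates.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = 0.5 * x + 0.7
sx = np.full_like(x, 0.05)
sy = np.full_like(y, 0.05)
result = monte_carlo_fit(x, y, sx, sy)
```

The stored slope and intercept samples also lend themselves directly to the graphical representation of variability mentioned above, e.g. as histograms or scatter plots of the trial fits.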