The problem of detecting gross measurement errors (outliers) during automated processing of measurement data series obtained from technical devices is considered. A modification is proposed of a strategy for detecting outliers in time series of noisy data containing an unknown trend, developed earlier by the first author. The earlier strategy consists of two stages: fitting a trend, and then applying an optimal-solution search algorithm to the residuals obtained by subtracting the fitted trend from the measurement data. The trend is sought in the class of power polynomials by the least squares method on sets of reference values whose number is specified in advance; the trend search is implemented as an absolutely convergent iterative process based on the method of minimizing sequences (sets). A shortcoming of the earlier strategy is that the total number of reference values on which the trend is built must be fixed beforehand, which can distort both the trend estimate and the detection of outliers. The proposed strategy corrects this shortcoming: the number of reference values is chosen so as to minimize the number of detected outliers on the one hand and to maximize the number of reference values on the other (a sketch of this selection rule is given below). Results of numerical testing on real satellite laser ranging measurements are presented. The developed strategy can be used to detect and eliminate outliers from time series of measurement data at the preprocessing stage.
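The selection rule for the number of reference values can be summarized in code. The following Python sketch is an illustration only, not the authors' algorithm: the polynomial fit via np.polyfit, the evenly spaced choice of reference sets, the MAD-based residual threshold, and all names (fit_trend, detect_outliers, select_reference_count, k, m_min) are assumptions standing in for the paper's absolutely convergent iterative trend search and its actual outlier criterion. What it does show is the dual criterion: among candidate counts of reference values, keep the one yielding the fewest detected outliers, breaking ties in favor of more reference values.

```python
# Illustrative sketch only; the paper's trend-fitting iteration (method of
# minimizing sequences) and outlier criterion are not reproduced here.
import numpy as np

def fit_trend(t, y, ref_idx, degree=3):
    """Least-squares power-polynomial trend built on the reference subset."""
    coeffs = np.polyfit(t[ref_idx], y[ref_idx], degree)
    return np.polyval(coeffs, t)

def detect_outliers(residuals, k=3.0):
    """Flag residuals beyond k robust standard deviations (MAD-based;
    an assumed rule, not the paper's optimal-solution criterion)."""
    med = np.median(residuals)
    scale = 1.4826 * np.median(np.abs(residuals - med))
    if scale == 0.0:                       # degenerate case: fall back to std
        scale = residuals.std() or 1.0
    return np.abs(residuals - med) > k * scale

def select_reference_count(t, y, degree=3, m_min=10):
    """Sweep the number of reference values m; among the values of m that
    yield the fewest detected outliers, keep the largest one."""
    best, best_mask = None, None
    for m in range(m_min, len(t) + 1):
        # Simplistic reference-set choice (evenly spaced indices); the paper
        # selects reference sets via its own iterative minimization.
        ref_idx = np.linspace(0, len(t) - 1, m).astype(int)
        trend = fit_trend(t, y, ref_idx, degree)
        mask = detect_outliers(y - trend)
        key = (mask.sum(), -m)             # minimize outliers, then maximize m
        if best is None or key < best:
            best, best_mask = key, mask
    return -best[1], best_mask             # chosen m and the outlier mask
```

Encoding the criterion as the lexicographic key (number of outliers, −m) makes the stated preference, fewest outliers first and then as many reference values as possible, explicit in a single comparison.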