The classical LSE (least squares estimator) of the regression parameter in a simple regression model, the MLE under normally distributed errors, is known to be highly nonrobust to outliers (gross-error contamination) and to plausible departures from the assumed normality. The estimator based on the Kendall tau statistic, known as the Theil-Sen estimator (Sen 1968, JASA), is robust, median-unbiased, and insensitive to nonnormality of the errors. In this simple regression setup, it is tacitly assumed that the regressors are nonstochastic. In many situations, however, the regressors are not only stochastic but also subject to superimposed measurement errors. This scenario is appraised in a simple measurement error model:
$$ Y_i = \alpha + \beta X_i + e_i, \quad W_i = X_i + U_i, \quad i = 1, \ldots, n,$$
where the \(X_i\) are i.i.d. random variables with a distribution having location
\(\mu_x\), and the errors
\(e_i\), \(U_i\), and
\(V_i = X_i - \mu_x\), \(i = 1, \ldots, n\), are mutually independent. The usual scenario is to assume that all these errors have normal distributions with zero means and appropriate variances. We examine this stringent assumption and appraise the scenario beyond conventional normality; the findings are simple and yet interesting.
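To make the estimator under discussion concrete, the following is a minimal sketch (not taken from the paper) of the Theil-Sen slope and intercept: the slope is the median of all pairwise slopes, and the intercept is the median of the resulting residuals. In the measurement error setup above, one would pass the observed \(W_i\) in place of the unobserved \(X_i\).

```python
import statistics
from itertools import combinations

def theil_sen(x, y):
    """Theil-Sen estimator: median of pairwise slopes, then median residual.

    A minimal illustrative sketch; x plays the role of the observed
    regressor (W_i in the measurement error model above).
    """
    # Slope: median of (y_j - y_i) / (x_j - x_i) over all pairs i < j,
    # skipping pairs with tied x-values.
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i, j in combinations(range(len(x)), 2)
              if x[j] != x[i]]
    beta = statistics.median(slopes)
    # Intercept: median of the residuals y_i - beta * x_i.
    alpha = statistics.median(yi - beta * xi for xi, yi in zip(x, y))
    return alpha, beta

# One gross outlier barely perturbs the fit, unlike least squares.
x = list(range(10))
y = [2 * xi + 1 for xi in x]
y[-1] = 1000  # gross-error contamination
alpha, beta = theil_sen(x, y)
```

With this contaminated sample the estimator still recovers slope 2 and intercept 1 exactly, illustrating the robustness to gross errors noted above.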