When the independent variables in a multiple regression analysis are highly correlated among themselves, multicollinearity is said to exist. Multicollinearity often results in imprecise estimated regression coefficients. To lessen the problems caused by multicollinearity, and thereby reduce the standard errors of the estimated regression coefficients, one or several independent variables may be dropped from the model. This remedial measure may cause originally significant independent variables to become nonsignificant, a phenomenon that reveals the possibility that intercorrelation among independent variables may have a positive effect on regression analysis. In this paper, we find that, in the case of two independent variables, if the two variables have middle-to-high correlation, and the product of the correlation coefficients between the response variable and each of the two independent variables differs in sign from the correlation coefficient between the two independent variables, then the two variables jointly make a significant contribution to explaining the response variable. We obtain similar results for the case of k independent variables. The method we suggest has the advantage of choosing variables for a regression model when the forward stepwise regression procedure selects no variables, or when the backward elimination approach is unsuitable because the sample size is small. We give two examples to illustrate these concepts.
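As a minimal numerical sketch of the two-variable condition, the Python snippet below uses hypothetical correlations r_y1 = r_y2 = 0.2 and r_12 = -0.8 (so the product r_y1 · r_y2 = 0.04 > 0 differs in sign from r_12 < 0, and |r_12| is in the middle-to-high range) together with an assumed sample size of n = 30; these values are illustrative assumptions, not figures from the paper. It computes each variable's simple-regression t statistic and the joint R² from the standard two-predictor formula, showing how two individually nonsignificant variables can be jointly significant.

```python
import numpy as np

# Hypothetical correlations (assumed for illustration, not from the paper):
#   r_y1, r_y2 : correlations of the response y with x1 and x2
#   r_12       : correlation between x1 and x2
# The sign condition holds: r_y1 * r_y2 = 0.04 > 0 while r_12 = -0.8 < 0,
# and |r_12| = 0.8 is in the middle-to-high range.
r_y1, r_y2, r_12 = 0.2, 0.2, -0.8
n = 30  # assumed sample size

# Each variable alone: t statistic for the simple regression of y on x_i,
#   t = r * sqrt(n - 2) / sqrt(1 - r^2)
for r in (r_y1, r_y2):
    t = r * np.sqrt(n - 2) / np.sqrt(1 - r**2)
    print(f"simple regression: r = {r:.2f}, t({n - 2}) = {t:.2f}")  # ~1.08

# Both variables jointly: squared multiple correlation from the standard
# two-predictor formula,
#   R^2 = (r_y1^2 + r_y2^2 - 2 * r_y1 * r_y2 * r_12) / (1 - r_12^2)
R2 = (r_y1**2 + r_y2**2 - 2 * r_y1 * r_y2 * r_12) / (1 - r_12**2)

# F statistic for H0: beta_1 = beta_2 = 0 with k = 2 predictors
k = 2
F = (R2 / k) / ((1 - R2) / (n - k - 1))
print(f"joint fit: R^2 = {R2:.2f}, F({k}, {n - k - 1}) = {F:.2f}")  # 0.40, 9.00
```

Under these assumed values, each t statistic of about 1.08 falls well short of the two-sided 5% critical value t(28) ≈ 2.05, while the joint F of 9.0 far exceeds the 5% critical value F(2, 27) ≈ 3.35, illustrating the joint significance the sign condition predicts.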