Developing robust, low-complexity models capable of coping with environmental volatility is the goal of every data mining project. This study establishes heuristics for investigating the impact of noise in instance attribute data on learning-model volatility. In addition, it introduces an alternative method for determining attribute importance and feature ranking based on attribute sensitivity to noise. We present an empirical analysis of the effect of attribute noise on model performance and on the overall learning process. The study employs datasets drawn from several domains, including medicine, CRM, and security. The proposed technique has practical implications: it supports building low-volatility, high-performance predictive models prior to production deployment. The study also has implications for research by filling a gap in the literature on attribute noise and its impact.
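To make the noise-sensitivity idea concrete, the following is a minimal sketch, not the authors' exact procedure (the abstract does not specify the noise model or metric). It assumes Gaussian noise injected into one attribute at a time and ranks attributes by the resulting drop in test accuracy; the function name `noise_sensitivity_ranking`, the scikit-learn classifier, and the `noise_scale` parameter are all illustrative assumptions.

```python
# Sketch: rank features by how much injecting Gaussian noise into each
# attribute degrades a trained model's test accuracy (assumed procedure).
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split


def noise_sensitivity_ranking(model, X_test, y_test, noise_scale=1.0, seed=0):
    """Rank features by the accuracy drop observed when each one is perturbed."""
    rng = np.random.default_rng(seed)
    baseline = accuracy_score(y_test, model.predict(X_test))
    drops = []
    for j in range(X_test.shape[1]):
        X_noisy = X_test.copy()
        # Zero-mean Gaussian noise scaled to the attribute's own std. dev.
        X_noisy[:, j] += rng.normal(0.0, noise_scale * X_test[:, j].std(),
                                    size=len(X_test))
        drops.append(baseline - accuracy_score(y_test, model.predict(X_noisy)))
    drops = np.array(drops)
    # Larger accuracy drop => model is more sensitive to noise in that attribute.
    return np.argsort(drops)[::-1], drops


X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=42)
clf = RandomForestClassifier(n_estimators=100, random_state=42).fit(X_tr, y_tr)
ranking, drops = noise_sensitivity_ranking(clf, X_te, y_te)
```

Attributes at the top of `ranking` are those whose corruption hurts the model most; under the study's framing, a model whose performance depends heavily on few noise-sensitive attributes is more volatile than one whose sensitivity is spread thinly.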