This paper reflects on the conceptualization of risk management adopted in the field of information systems (IS). We explore unintended consequences, uncertain risk, and systemic risks that are not yet known, with particular attention to unexpected failures of human-technology interaction. Conventionally, IS researchers treat risk management as a matter of trade-offs involved in risk-taking, and they rely on probabilistic and statistical approaches to provide a rigorous basis for decision-making. IS security is the topic where risks are most widely documented, yet the conceptualization of uncertain risk remains unclear. This paper proposes a reconceptualization of knowledge of risk that incorporates both probabilistic risk and uncertain risk. Drawing on Stirling’s (2007) classification of incomplete knowledge (risk, uncertainty, ambiguity, and ignorance), we report pre-test results from a review of the dominant IS literature (2010-2015). We present evidence for our assertion that IS research has yet to engage deeply with the notion of uncertain risk. This paper suggests that the precautionary principle, drawn from legal and policy domains, allows a deeper and broader understanding of knowledge of risk in the IS field. This understanding provides insight into a more “responsible governance of risk” toward sustainability, avoiding a deterministic “tunnel vision” regarding the potential dangers of future IT development under broader uncertainty and complexity.