To implement an effective iron scale mitigation strategy, operators first need to identify the main source of iron in the system. Establishing the main source of Fe2+ in higher-temperature sour wells, where iron sulphide and iron carbonate form, can be problematic. Many fields lack reliable formation water compositions, and the analysis of produced water often does not permit clear conclusions when corrosion and scale formation occur in the well. This work describes a method to predict the "maximum dissolved iron" (MDI) concentration in a reservoir/production system. The MDI is the amount of dissolved iron potentially present at equilibrium either in the reservoir or in the production system; for example, we will show that the MDI can differ considerably between a carbonate reservoir and the production system. Using the MDI concept, we aim to identify whether iron deposits may form from naturally occurring Fe2+ (present in formation fluids) or solely from corrosion processes and/or external sources. The results presented in this paper include a sensitivity study on the effects of dissolved CO2 and H2S concentrations, temperature, calcium levels and salinity on the MDI. Finally, results from two field cases are presented: one a sour gas/condensate field and the other a black oil reservoir.