This question describes a method to calculate the number of Monte Carlo simulation runs required. Another method checks the convergence of the mean of a particular output variable. Both methods focus on the output variables, without regard to the number or variance of the input variables.
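To make the convergence-of-the-mean criterion concrete, here is a minimal sketch. The model function and all tolerances are hypothetical stand-ins; the point is that the stopping rule looks only at the output's standard error, not at how many inputs the model has:

```python
import numpy as np

rng = np.random.default_rng(0)

def run_model(inputs):
    # Hypothetical toy model: sum of squared inputs.
    # Any simulation mapping inputs -> scalar output would work.
    return float(np.sum(inputs**2))

def runs_until_converged(n_inputs, tol=0.05, batch=100, max_runs=100_000):
    """Add runs until the standard error of the output mean drops
    below `tol` (the convergence-of-the-mean criterion)."""
    outputs = []
    while len(outputs) < max_runs:
        for _ in range(batch):
            outputs.append(run_model(rng.standard_normal(n_inputs)))
        # Standard error of the mean: sample std / sqrt(n).
        se = np.std(outputs, ddof=1) / np.sqrt(len(outputs))
        if se < tol:
            break
    return len(outputs)
```

Note that the standard error of a Monte Carlo mean is sigma/sqrt(n), where sigma is the standard deviation of the *output*; input dimensionality enters only indirectly, through its effect on that output variance.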
In general, will the number of Monte Carlo simulation runs need to increase as the number of varying input variables/parameters increases?
Or does the number of required runs depend only on the variance of the input variables?