The paper observes that the peak of the classical U-shaped bias-variance curve occurs when the number of parameters is roughly equal to the number of samples.
It is common to use neural networks with an extremely large number of parameters. But to achieve interpolation for a single output (regression or two-class classification), one expects to need at least as many parameters as there are data points. Moreover, if the prediction problem has more than one output (as in multi-class classification), the number of parameters needed should be multiplied by the number of outputs. This is indeed what is observed empirically for neural networks.
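A minimal sketch of this counting argument (my own illustration, not code from the paper): a linear model with p random features can fit n arbitrary targets exactly only once p >= n, and with K outputs the weight matrix holds p*K entries, so exact interpolation of all outputs needs on the order of n*K parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
n, K = 100, 10                      # samples, outputs
Y = rng.normal(size=(n, K))         # arbitrary targets

for p in (50, 100, 200):            # number of features (=> p*K weights)
    X = rng.normal(size=(n, p))     # random feature matrix
    W, *_ = np.linalg.lstsq(X, Y, rcond=None)
    err = np.abs(X @ W - Y).max()   # worst-case fit error over all outputs
    print(f"p={p:>3}, params={p*K:>5}, max residual={err:.2e}")
# The residual drops to ~machine precision once p >= n:
# below the threshold the model cannot interpolate, above it it can.
```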
For MNIST, n (number of samples) = 4000, d (dimension) = 784, and K (number of classes) = 10. The authors observed the interpolation threshold (denoted by the dashed line in the graph above) at the predicted number of parameters, n·K = 40,000.
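As a back-of-the-envelope check (my own arithmetic, under the assumption of a one-hidden-layer fully connected network and counting weights only, not biases), the hidden width H at which such a network reaches the predicted threshold satisfies H·(d + K) = n·K:

```python
# Assumed architecture: one hidden layer of H units, weights only.
# Parameter count: H*(d + K). Solve H*(d + K) = n*K for H.
n, d, K = 4000, 784, 10
threshold = n * K                    # predicted parameter count: 40,000
H = threshold / (d + K)              # hidden width at the threshold
print(f"threshold = {threshold} parameters, H ~ {H:.1f} hidden units")
# -> threshold = 40000 parameters, H ~ 50.4 hidden units
```

So the dashed line at n·K parameters corresponds to a network of roughly 50 hidden units under these counting assumptions.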