It's interesting. Nate Silver basically wrote a book about this article, even touching on all the same examples, and drew the exact opposite conclusions. Weather modeling, for example, lends itself well to computation because the phenomena are well understood but mathematically intensive. Epidemiology is poorly understood mathematically, so entire towns are modeled down to the ages, occupations, blood types, and hobbies of individual "vectors" (sims, basically). Economic modeling is utterly devoid of real-world data, making it mostly nonsensical.

Silver's argument is that models should be trusted only as far as you can understand and verify them, no further, and that the more you verify a model's behavior, the more closely it will approximate reality under the conditions in which it was tested. Which is to say, push your model out of its verified box and it's nonsense again. He basically argued for statistical determinism: test something enough and, if nothing else, you'll have a good idea of what your test results are likely to be.
I saw a lecture yesterday where the speaker quoted a saying I hadn't heard before, but that stuck with me: "All models are wrong; some are useful." I'm not entirely sure I agree with the quote, but it at least sheds light on the incompleteness of modeling physical systems, which I think is undeniable.
Nate (as always) is the one who is right. Has he got the new 538 up yet? I miss it.