These computer models are doing very well, but not yet well enough. The problem is dynamic range: any simulation that models a large enough piece of the universe to be representative (a cube 100 Mpc on a side seems sufficient) cannot have the resolution to accurately follow what happens on galactic scales (kpc scales). At present, each point in the N-body code represents an enormous mass, and you cannot model star formation and galaxy dynamics when an entire galaxy consists of only a handful of particles. This means that ad-hoc rules have to be introduced to model the formation of stars and galaxies: for example, you could assume that a galaxy forms whenever the density passes some arbitrary threshold, with a luminosity proportional to its baryonic mass.
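As a minimal sketch of what such an ad-hoc rule might look like, the toy function below flags grid cells whose density exceeds an arbitrary threshold and assigns each a luminosity proportional to its baryonic mass. The function name, the threshold, and the mass-to-light constant are all illustrative assumptions, not taken from any real simulation code.

```python
import numpy as np

def identify_galaxies(density, baryon_mass, threshold, mass_to_light=1.0):
    """Toy version of an ad-hoc galaxy-formation rule.

    A cell "forms a galaxy" when its density exceeds an arbitrary
    threshold; its luminosity is then taken proportional to its
    baryonic mass. All parameter values here are illustrative.
    """
    forms_galaxy = density > threshold  # arbitrary density cut
    # luminosity proportional to baryonic mass where a galaxy forms, else zero
    luminosity = np.where(forms_galaxy, mass_to_light * baryon_mass, 0.0)
    return forms_galaxy, luminosity

# Toy grid: densities and baryonic masses in arbitrary units
density = np.array([0.5, 2.0, 1.1, 3.7])
baryon_mass = np.array([1.0, 4.0, 2.0, 8.0])

flags, lum = identify_galaxies(density, baryon_mass, threshold=1.0)
```

The arbitrariness of the threshold and of the mass-to-light ratio is exactly the kind of approximation the text is complaining about: the simulation cannot derive these quantities from first principles at its resolution, so they must be put in by hand.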
This is unfortunate: the computers can accurately simulate things we cannot observe, such as the large-scale distribution of dark matter, but not the things we can observe, like the galaxy luminosity function, star formation rates and clustering.
So surely the best thing to spend money on is faster supercomputers! With a network of massively parallel modern supercomputers, the simulations will become so good that they can accurately predict observable quantities without the need for so many ad-hoc approximations. The supercomputers will be welcomed by scientists in many other disciplines, as well as by other astronomers. Forget obtaining more data: a relatively modest investment in computing power will allow us to better understand the data we already have.