It depends how much more data and how much more noise.

Take some model with sqrt(n) convergence, for example. Maybe your error goes like noise/sqrt(n), which means for the same error, the noise vs data tradeoff goes as:

const * noise' / sqrt(n') = error = const * noise / sqrt(n)

--> sqrt(n') / sqrt(n) = noise' / noise

--> n' / n = (noise' / noise)^2

So 4x the data for 2x the noise to come out even. If you offer me 10x the data for 2x the noise, I've come out ahead. If you offer me 2x the data for 2x the noise, I've come out behind.
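A quick Monte Carlo sketch of that tradeoff (names and numbers are mine, not from any particular library), using the sample mean of Gaussian draws as the canonical sqrt(n)-rate estimator:

```python
import random

def mean_estimate_error(noise, n, trials=2000, seed=0):
    """Average absolute error of the sample mean of n draws from N(0, noise^2)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        est = sum(rng.gauss(0.0, noise) for _ in range(n)) / n
        total += abs(est)
    return total / trials

base = mean_estimate_error(noise=1.0, n=100)    # baseline: 1x noise, 1x data
even = mean_estimate_error(noise=2.0, n=400)    # 2x noise, 4x data: ~same error
behind = mean_estimate_error(noise=2.0, n=200)  # 2x noise, only 2x data: ~sqrt(2)x worse
```

Here `even` comes out close to `base`, while `behind` is noticeably larger, matching the n' / n = (noise' / noise)^2 break-even rule.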

That all assumes a particular convergence rate, of course. YMMV (your model may vary).