Iteration over small samples and recursive improvement
I wasted so much time aiming for a goal and looking from the outset for a solution that works over either a long period of time or a large sample, or both. In both cases you have to wait a long time before you can assess whether you have found a solution, and it's improbable that you stumble on the right one on the first try.
It's good to have an overall direction, but it's better to address the problem in your face today. And the way to do that is by iterating over a small sample and improving recursively.
Iterating over a small sample
Addressing the problem in your face today means you don't wait months, or wait for a hundred observations, to evaluate whether what you're doing works. Think of the simplest version of your idea you can test today or this week. That way you quickly find out if you're onto something. Ten is a good sample size. As in ten people, ten emails, ten trades.
Iterating also means you don't throw an idea away the moment it doesn't work. Look for what can be tweaked to improve your first version. Favor small changes over starting over. That is the essence of iteration.
Big and long stifles iteration.
Small and fast fuels it.
Recursive improvement
Once you've got something going, instead of just adding new things, reuse what you've already got to improve it. That's the principle of the growing snowball. It's not just snow being added from the outside. It's a little chunk of snow that becomes a ball; once it can roll, the rolling itself collects more snow.
Think of AI used to improve AI.
That's how nature works. It takes a single cell, then uses it to create two, then four, and so on, reusing what it has just created at each step to keep improving.
So if you've built or found something that solves the problem in your face today, use it to come up with an improved solution that solves a bigger, more ambitious version of the problem.