I use a site called PredictionBook to make predictions, say how likely I think they are to occur, and then later judge whether I was right or wrong. Over many predictions, this improves my calibration — my sense of how likely things actually are to happen in the future.
The reason this is super useful is that pretty much all humans are overconfident in their predictions. As one blogger put it:
Nearly everyone is very very very overconfident. We know this from experiments where people answer true/false trivia questions, then are asked to state how confident they are in their answer. If people’s confidence was well-calibrated, someone who said they were 99% confident (ie only 1% chance they’re wrong) would get the question wrong only 1% of the time. In fact, people who say they are 99% confident get the question wrong about 20% of the time.
So if I say I think my team will ship Feature X before March with 80% confidence, I want that to happen 80% of the time, not 55% of the time. If I think we are 95% likely to hit our sales targets, then I'd like us to miss only 5% of the time. I want to be well-calibrated. This is really important not just for business strategy, but also for day-to-day decisions. If I'm "pretty sure" that I'll have enough time to hit the gym before my meeting, then I don't want to miss that meeting because my "pretty sure" is a human's normally overconfident way of judging a "slightly probable" situation.
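PredictionBook does this bookkeeping for me, but checking calibration is conceptually simple: group past predictions by stated confidence and compare each group's stated probability to how often those predictions actually came true. Here's a minimal sketch in Python (the sample predictions are invented for illustration):

```python
from collections import defaultdict

# Hypothetical record of predictions: (stated confidence, did it happen?)
predictions = [
    (0.9, True), (0.9, True), (0.9, False), (0.9, True),
    (0.6, True), (0.6, False), (0.6, True), (0.6, False),
]

def calibration(predictions):
    """Group predictions by stated confidence; return observed frequency per group."""
    buckets = defaultdict(list)
    for confidence, outcome in predictions:
        buckets[confidence].append(outcome)
    return {
        conf: sum(outcomes) / len(outcomes)
        for conf, outcomes in sorted(buckets.items())
    }

for conf, observed in calibration(predictions).items():
    print(f"stated {conf:.0%} -> happened {observed:.0%} of the time")
```

With the invented data above, the 90% predictions came true only 75% of the time — exactly the overconfidence pattern the trivia experiments show.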
But it's hard to realize that when you say something is 99% likely, it only happens 80% of the time, unless you keep track of your predictions. So, I practice using PredictionBook. Over time, I have become less confident at the high end of certainty and more confident at the low end: