I hate to be a downer, but your methodology is very flawed.
First of all, you assume 75 pitch speed is perfect based on a faulty eye test. Meanwhile, you call a 3.3 run time "too slow" when it was off by mere tenths of a second.
Why do you break one down to fractions of a second while settling for "eh, that looks good enough" for the other?
That said, 75 pitch speed is absolutely NOT accurate to real life. It is significantly slower (I'd wager 100 pitch speed is actually slightly slower than real life as well, but I haven't tested that; I'm just going off my own eye test).
The real problem with this error, though, is that baserunner speed during a steal gets slowed down by the same amount as pitch speed. That is the nature of the animation: until the pitch reaches home plate, the entire game speed matches that of the pitch. Put pitch speed at 0 and watch your guys attempt to steal; they'll start in slow motion, then return to normal speed as soon as the catcher catches the ball.
The error, then, is that you are striving for accurate steal times while the beginning of the steal animation plays out slower than real life. That means 80 baserunner speed is actually faster than real life, because it has to compensate for that slowed-down opening segment.
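Just to make the compensation effect concrete, here's a rough sketch of the math (this is not the game's actual code, and the steal time, pitch flight time, and slowdown factor are numbers I made up purely for illustration):

# Back-of-the-envelope model of the compensation described above.
# Every number here is made up for illustration; only the relationship
# matters, not the exact values.

def required_speed_multiplier(real_steal_time, real_pitch_flight, pitch_speed_scale):
    # real_steal_time   : real-world steal time, first move to tag (seconds)
    # real_pitch_flight : real-world time the pitch is in the air (seconds)
    # pitch_speed_scale : fraction of real speed the game runs at while the
    #                     pitch is in flight (0.85 = pre-catch plays 15% slow)
    #
    # While the pitch is in the air the whole game (runner included) is
    # time-stretched, so that segment eats up extra stopwatch time:
    extra_stopwatch_time = real_pitch_flight * (1 / pitch_speed_scale - 1)
    # To cover the same distance in the same measured total time, the
    # runner's underlying speed must be this many times real life:
    return real_steal_time / (real_steal_time - extra_stopwatch_time)

# e.g. a 3.4 s real steal, 0.55 s pitch flight, pre-catch running at 85%:
print(f"{required_speed_multiplier(3.4, 0.55, 0.85):.3f}x")  # ~1.03x real speed

The exact multiplier isn't the point; the point is that any slowdown before the catch forces runner speed above real life if the stopwatch total is going to match real life.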
This is why you see so many people complaining about easy triples, inside-the-park home runs, and the like: the setting may produce accurate steal times, but it does so by making players significantly faster than real life.
I'm not trying to be a jerk; I love people who attempt to use statistics to generate sliders. My issue is that you make some fundamental errors in the way you go about it.
Put pitch speed at 100, then re-time steal attempts, and I think you'd get closer to "real life" runner speed. Even then, though, I believe 100 pitch speed is still slightly slower than real life, so you'd probably have to knock runner speed down another 5-10 points from whatever you land on. But in the end, I think it will give you much more accurate results.
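And if you do re-time at 100 pitch speed, something as simple as this keeps the adjustment honest (the 3.4 s target and the 0.05 s tolerance are placeholders; plug in whatever real-life steal-time benchmark you're actually working from):

# Hypothetical helper for the re-test: time a handful of steals at 100
# pitch speed, compare the average to a real-life benchmark, and nudge
# the runner-speed slider accordingly. Target and tolerance are guesses.

from statistics import mean

def suggest_adjustment(timed_steals, real_life_target=3.4, tolerance=0.05):
    avg = mean(timed_steals)
    diff = avg - real_life_target
    if abs(diff) <= tolerance:
        return f"avg {avg:.2f}s is close enough; leave runner speed alone"
    # Too slow (times too long) => raise the slider; too fast => lower it.
    direction = "raise" if diff > 0 else "lower"
    return f"avg {avg:.2f}s vs {real_life_target}s target; {direction} runner speed a few points and re-time"

print(suggest_adjustment([3.55, 3.48, 3.60, 3.52]))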