DISCLAIMER: Nothing in the following post is a slight, complaint, or judgement of the amazing work being done by roster editors on this site. Without them, I'd be playing with 'QB #3' handing off to 'HB #21' who gets tackled by 'ROLB #43'.
A while back, I had compiled 11 seasons' worth of recruiting classes to do some in-depth analysis. I dug it out this week and decided to compare the true freshmen from that recruiting data to the true freshmen in a roster found on OS. Across the 11 seasons of recruiting data, the highest OVR was 86, and only 2 of 26,837 players reached it; the base game roster topped out at a single 84. By comparison, the OS roster file had 16 of its 8,694 true freshmen at an overall of 86 or higher.
This gave me an idea... what would happen if I adjusted the updated roster to the original scale used on the default roster? Why, you ask? Because I don't want to start a dynasty with players on one scale, only to have them replaced over time by players generated on a different scale. Yes, I'm weird, but that's how I want to do it, if possible.
So how does this work? If a skill ranged from 15-90 in the default roster but from 20-99 in the update, I would convert the updated roster's values to match the original 15-90 scale. I'm breaking the roster down by player year, redshirt status, and position, so that each group is scaled only against its own group. That's a total of 168 groups (4 years x 2 redshirt statuses x 21 positions), each with its own max/min for each of the 43 skills. Throwing accuracy for quarterbacks is going to be on a different scale than for defensive ends, and true freshman QBs are going to be on a different range than redshirt senior QBs.
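If it helps to picture the math, here's a rough sketch of that per-group conversion in Python/pandas instead of Excel. The file names and column names below are just placeholders for however the rosters get exported to CSV, and it assumes every year/redshirt/position group in the update also shows up in the default roster:

```python
import pandas as pd

# Sketch of the per-group min/max conversion, assuming both rosters have been
# exported to CSV. All file and column names here are placeholders, not the
# actual export format of any roster editor.
GROUP_COLS = ["YEAR", "REDSHIRT", "POSITION"]
SKILL_COLS = ["SPD", "THP", "THA"]  # ...extend to all 43 rating columns

default = pd.read_csv("default_roster.csv")   # base-game roster (target scale)
updated = pd.read_csv("updated_roster.csv")   # OS roster to be rescaled

def group_stats(df: pd.DataFrame, suffix: str) -> pd.DataFrame:
    """Per-group min and max of every skill, with columns tagged by roster."""
    stats = df.groupby(GROUP_COLS)[SKILL_COLS].agg(["min", "max"])
    stats.columns = [f"{col}_{stat}_{suffix}" for col, stat in stats.columns]
    return stats.reset_index()

# Attach each player's group min/max from both rosters. Assumes every
# (year, redshirt, position) group in the update also exists in the default.
merged = (updated
          .merge(group_stats(updated, "upd"), on=GROUP_COLS, how="left")
          .merge(group_stats(default, "def"), on=GROUP_COLS, how="left"))

for col in SKILL_COLS:
    u_min, u_max = merged[f"{col}_min_upd"], merged[f"{col}_max_upd"]
    d_min, d_max = merged[f"{col}_min_def"], merged[f"{col}_max_def"]
    span = (u_max - u_min).replace(0, 1)   # flat groups map straight to d_min
    merged[col] = (d_min + (merged[col] - u_min) / span * (d_max - d_min)).round()

# Keep only the original columns and write the converted roster back out.
merged[updated.columns].to_csv("rescaled_roster.csv", index=False)
```

The Excel version is the same idea: each cell becomes origMin + (value - updMin) / (updMax - updMin) * (origMax - origMin), using the min/max of that player's group for both rosters.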
This can be accomplished easily enough in Excel. However, it leaves me with a couple of questions. Do I have to include a calculation for each player's overall, or will it update when I load the roster in game? Along the same lines, do I have to recalculate the team overall, offense, and defense ratings, or will those update when I load the roster in game?