How do you define “industry standard”? Which “industry” are we talking about: the industry that uses products, or the one that makes them?
In general, FPS is used by people talking about the framerate for their games. Their finished games.
For applications in development, FPS is less useful. For example, let’s take your statement:
“noticed a huge drop in FPS (around 50 Frames), for some reason its not nearly as bad (5 Frames).”
In one case, you noticed a drop of 50 FPS. In another, you got a drop of 5 FPS. The problem is that a change in FPS is meaningless without context.
For example, if you’re running at 30 FPS and drop 5 FPS, you’ve gone down to 25 FPS, which is huge: a 16.7% decline in framerate. However, if you’re running at 300 FPS (as can happen in development) and drop 5 FPS, you’re down to 295 FPS. That’s only a 1.7% decline in framerate.
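The relative comparison above is just arithmetic; a small sketch (the function name is mine, not from any particular library) makes it concrete:

```python
def percent_decline(fps_before, fps_after):
    """Relative framerate decline, as a percentage of the starting FPS."""
    return 100.0 * (fps_before - fps_after) / fps_before

# The same 5 FPS drop is a very different fraction of the total:
print(percent_decline(30, 25))    # ~16.67%
print(percent_decline(300, 295))  # ~1.67%
```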
A change in FPS only makes sense if you know what the FPS was before the change. A change in the time it takes to render is absolute; it means the same no matter what.
Going from 30 FPS to 25 FPS represents going from a frame time of 33.3 milliseconds to 40 milliseconds, a change of 6.67 milliseconds. That’s a lot. Going from 300 FPS to 295 FPS represents a change of 0.056 milliseconds, which is not very much at all. I wouldn’t be terribly concerned about a change that costs 0.056 milliseconds per frame, while one that costs 6.67 ms is a real issue.
Both cases represent a drop of 5 FPS, but the actual time costs are very different. So millisecond timings are far more useful than framerate for measuring performance.
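The frame-time arithmetic above can be sketched the same way (again, these helper names are illustrative, not any engine’s API):

```python
def frame_time_ms(fps):
    """Convert a framerate in FPS to a per-frame time in milliseconds."""
    return 1000.0 / fps

def drop_cost_ms(fps_before, fps_after):
    """Absolute cost of a framerate drop, in milliseconds per frame."""
    return frame_time_ms(fps_after) - frame_time_ms(fps_before)

# The same 5 FPS drop, measured in absolute frame time:
print(f"{drop_cost_ms(30, 25):.2f} ms")    # ~6.67 ms: a real problem
print(f"{drop_cost_ms(300, 295):.3f} ms")  # ~0.056 ms: negligible
```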