IMO using 1 / (single frame time) for min/max FPS seems dumb; it should just be shown as the shortest (or longest) frame time.
Technically, the term used is 'Min/Max'. It does not say min fps or max fps.
You can always contact the developer and ask how they calculate it.
1 / (time for the single slowest or fastest frame) is a very useful statistic.
To give an unrealistic example: suppose it takes half a second to render one frame, and then 98 more frames are generated in the next 0.5 seconds.
Your FPS for that second is then 99. If you only looked at that one number, it would seem the game ran smoothly, but in reality it did not.
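A minimal sketch of that arithmetic in Python (the frame times are the hypothetical numbers from the example above, not output from any real tool):

```python
# Average FPS over one second can hide a severe stutter that the
# per-frame min/max makes obvious.

frame_times = [0.5] + [0.5 / 98] * 98  # one 500 ms stall, then 98 fast frames

avg_fps = len(frame_times) / sum(frame_times)  # frames / elapsed time
min_fps = 1.0 / max(frame_times)               # slowest single frame
max_fps = 1.0 / min(frame_times)               # fastest single frame

print(f"average FPS: {avg_fps:.0f}")  # 99  -> looks smooth
print(f"min FPS:     {min_fps:.0f}")  # 2   -> reveals the 500 ms stall
print(f"max FPS:     {max_fps:.0f}")  # 196
```

The average says 99 FPS, while the slowest frame alone corresponds to 2 FPS; which of those a tool labels "min" depends entirely on how it is calculated.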
Running a third-party tool such as Mangohud under Linux shows individual frame times that translate to FPS values well above and below the reported min/max FPS.
This gives the impression that at least one of the two tools may not be reliable.
If you look at the source code of both tools you can probably see where the differences lie.
This reminds me of the following article:
https://vermaden.wordpress.com/2022/07/12/desktop-environments-resource-usage-comparison/
Reading the article, you quickly come to the conclusion that each tool calculates RAM usage differently.
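As a concrete illustration, here is a minimal sketch, assuming a Linux system with a /proc filesystem (the read_kb helper is hypothetical, written just for this example). It shows that two standard kernel-reported memory metrics for the same process already disagree, which is one reason different tools report different numbers:

```python
import os
import re

pid = os.getpid()

def read_kb(path, field):
    """Return the value (in kB) of a 'Field:  N kB' line, or None."""
    with open(path) as f:
        for line in f:
            if line.startswith(field + ":"):
                return int(re.search(r"(\d+)", line).group(1))
    return None

rss = read_kb(f"/proc/{pid}/status", "VmRSS")      # resident set size
pss = read_kb(f"/proc/{pid}/smaps_rollup", "Pss")  # proportional set size
                                                   # (smaps_rollup needs a
                                                   # reasonably recent kernel)

print(f"VmRSS: {rss} kB  (shared pages counted in full)")
print(f"Pss:   {pss} kB  (shared pages split across processes)")
```

A tool that sums VmRSS over processes will report more memory usage than one that sums Pss, even though both are "correct".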
In any case, there is software in circulation that is simply not reliable.
Many programming languages are used for tasks they are not well suited to.
A non-trivial percentage of programmers are weak in math, and many are only really proficient in one or two programming languages (or not truly strong in any).
I am not saying this applies to this situation, but it is one possibility.