Interest rate "logic" is frequently justified with multiplicative accounting. Multiplicative accounting is just the practice of comparing quantities proportionally, using mulplication. There are 2 problems with multiplicative accounting:
It may sound superfluous to worry about interest rates down to many decimal places, but the differences add up very quickly. If you are buying a half-million-dollar home, not too uncommon in the U.S. today, then a hundredth of a percent, or one ten-thousandth, is $50 a year. Most of us would notice if someone tried to sneak an extra $50 onto a price tag, even a pretty big one: $500,050 instead of $500,000. But how many people are going to care about a 2.01% interest rate over 2%?
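A quick sketch of that arithmetic, using the figures from the example above:

```python
principal = 500_000   # the half-million-dollar home from the example above
rate_gap = 0.0001     # 2.01% - 2.00% = 0.01%, one ten-thousandth

extra_per_year = principal * rate_gap
print(f"Extra interest per year: ${extra_per_year:,.0f}")  # Extra interest per year: $50
```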
When you consider something bigger, the lost precision matters even more.
A 0.01% change in the interest rate on the national debt amounts to about $3 billion a year, which is roughly $9 per U.S. citizen, or 6,000 half-million-dollar homes.
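The same arithmetic at national-debt scale. A minimal sketch, assuming a round $30 trillion of debt (consistent with the $3 billion figure above) and a U.S. population of about 330 million:

```python
national_debt = 30e12   # assumed round figure, consistent with the $3B result above
rate_change = 0.0001    # 0.01%
population = 330e6      # rough U.S. population, an assumption for illustration
home_price = 500_000

annual_cost = national_debt * rate_change
print(f"${annual_cost / 1e9:.0f} billion per year")                  # $3 billion per year
print(f"${annual_cost / population:.0f} per citizen")                # $9 per citizen
print(f"{annual_cost / home_price:,.0f} half-million-dollar homes")  # 6,000 homes
```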
If you make 1% per year on a billion dollars, that's $10 million. If you make 1% a year on $10, that's only 10 cents. By using multiplicative accounting, it looks like paying 10 cents to someone with $10 of net wealth is as impactful as paying $10 million to a billionaire.
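A minimal sketch of how the same percentage hides wildly different absolute amounts:

```python
# The same 1% return, applied at two very different scales.
for wealth in (1_000_000_000, 10):
    gain = wealth * 0.01
    print(f"1% of ${wealth:,} is ${gain:,.2f}")

# 1% of $1,000,000,000 is $10,000,000.00
# 1% of $10 is $0.10
```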
This gets even worse once you consider that many people have an essentially negative net worth. Strictly in financial terms, that should lead to bankruptcy, but intangibles and lifetime potential earnings are hard to measure, and both should technically be included in net worth.
But if someone's financial accounts are in the red, this type of measurement makes it seem fair to charge them more money.
In reality, scale matters. You can't simply take the $27 trillion currently invested in treasury bonds and put it into Apple stock instead. Apple could never absorb that, and the company would become completely unproductive.
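To see why, compare the absolute sums directly. This sketch assumes a rough $3 trillion market capitalization for Apple, a ballpark figure for illustration rather than anything from the text:

```python
treasury_holdings = 27e12   # $27 trillion in treasury bonds, from the text
apple_market_cap = 3e12     # ~$3 trillion: assumed rough figure for illustration

# The bond market dwarfs even the largest single company.
ratio = treasury_holdings / apple_market_cap
print(f"Treasury holdings are about {ratio:.0f}x Apple's entire market cap")  # about 9x
```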
Instead of trying to turn everything into products and ratios, just add things up and compare them.
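Putting that advice into one final sketch, using the hypothetical 1% figures from above:

```python
# Compare absolute dollar amounts instead of ratios.
billionaire_gain = 1_000_000_000 * 0.01   # $10,000,000
small_saver_gain = 10 * 0.01              # $0.10

# Multiplicative accounting: both are "1%", so they look identical.
# Additive accounting: subtract to see the real difference.
difference = billionaire_gain - small_saver_gain
print(f"Same 1%, but the absolute gap is ${difference:,.2f}")  # $9,999,999.90
```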