In Ancient Rome, long before the existence of the decimal system, computations were often made in fractions that were multiples of 1/100. For example, Augustus levied a tax of 1/100, known as the centesima rerum venalium, on goods sold at auction. Computation with these fractions was equivalent to computing percentages. As denominations of money grew in the Middle Ages, computations with a denominator of 100 became more standard, and from the late 15th century to the early 16th century it became common for arithmetic texts to include such computations. Many of these texts applied these methods to profit and loss, interest rates, and the Rule of Three. By the 17th century it was standard to quote interest rates in hundredths.
Due to inconsistent usage, it is not always clear from the context what a percentage is relative to. When speaking of a "10% rise" or a "10% fall" in a quantity, the usual interpretation is that this is relative to the initial value of that quantity. For example, if an item is initially priced at $200 and the price rises 10% (an increase of $20), the new price will be $220. Note that this final price is 110% of the initial price (100% + 10% = 110%).
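As a minimal sketch of this interpretation (the function name apply_percent_change is illustrative, not taken from any library), the new value after a rise or fall is the initial value scaled by (1 + percent/100):

```python
def apply_percent_change(initial, percent):
    """Return the value after a percentage change relative to the initial value."""
    return initial * (1 + percent / 100)

# A 10% rise on an initial price of $200 gives $220,
# which is 110% of the initial price (100% + 10% = 110%).
print(apply_percent_change(200, 10))   # 220.0

# A 10% fall is likewise measured against the initial value.
print(apply_percent_change(200, -10))  # 180.0
```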