In situations such as scoring a test, reporting humidity, or determining voter turnout in an election, the amount is expressed as a portion of one hundred. What gives this number the authority to serve as a generalized maximum? I understand that, for the sake of simplicity and speed, a low round number in the decimal scale is needed as a reference point, but I wonder why one hundred in particular is set apart for this purpose. Moreover, if one hundred is the accepted limit for a percentage, why do we see statistics such as one hundred and fifty or two hundred percent? What is the ceiling for this sort of result?
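For what it's worth, the arithmetic itself shows how values above one hundred percent arise: a percentage is just a ratio scaled by one hundred, and nothing in that definition caps the ratio at one. A minimal sketch (the function name `percent` is my own):

```python
def percent(part, whole):
    """Express part as a fraction of whole, scaled to 100."""
    return part / whole * 100

# When part <= whole, the result stays within 0-100:
print(percent(87, 100))   # a test score: 87.0

# When part > whole, the result exceeds 100 with no ceiling:
print(percent(150, 100))  # 150.0
print(percent(300, 150))  # 200.0, e.g. attendance that doubled
```

So percentages over one hundred are perfectly well defined whenever the quantity being measured can exceed its reference value; the "ceiling" of one hundred only applies when the part is, by construction, a subset of the whole.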