Not to be a Debbie Downer or anything, but do people realize that raises aren't something anyone is entitled to? I've been a manager at a few different banks, a restaurant, and Target, and sometimes you don't get a raise, even when you deserve it. Companies aren't mandated or "supposed"...