If you just can’t shake that creeping thought that you’ve been spending Monday through Friday working for the wrong company, it’s worth exploring those feelings. Sometimes taking a step back is the best way to move forward on the right path. No good will come from staying in a place that isn’t fulfilling you creatively, professionally, or financially. Here are some signs it may be time to walk away from your job.
A decision this important should be made by you and you alone. Take your time and figure out where you want your career to take you and the best way to get there. If that route involves leaving your current gig, think very carefully about your decision and plan accordingly before doing anything rash.
Stagnation can be a career killer, and any successful professional will tell you that if you’re not habitually learning, growing, and advancing in your career, there’s a good chance you’ll eventually find yourself regressing. Another warning sign is misalignment with management, which can range from the ethical to the financial. If you’re being asked to perform tasks you disagree with, whether morally or from a business perspective, it’s probably time to dust off the old resume.
If you find yourself sleepwalking through your current role, ask yourself what type of job would excite you again. If it takes all your willpower simply to muster the determination to make breakfast and get dressed, there’s a high probability your job is affecting your well-being more than you realize.
It’s best to walk away from jobs that leave room for nothing else in your life. Update your resume, reach out to old colleagues, begin networking again, and start browsing new job opportunities.