Workers' compensation insurance was created to pay for lost income and medical care after a workplace accident.
However, we've seen a rise in cases involving injured employees who never filed for workers' compensation because their employers talked them out of making a claim.
Why You Should Always File a Workers' Compensation Claim for a Work Injury
There are many reasons employers would prefer that an employee not file a workers' compensation claim, such as avoiding increased insurance premiums or gaining grounds to discredit the employee at some point in the future. Even if an employer threatens to fire an employee, Florida law prohibits employers from retaliating against workers for filing workers' compensation claims.
We've represented thousands of injured workers at Johnson & Gilbert, P. A., and we know that it's almost always a mistake to agree to have an employer cover medical expenses outside of the workers' compensation system. Here are a few things that often happen when an employee fails to file a workers' compensation claim for a work injury:
Private health insurance provided by the employer stops paying medical bills.
Private health insurers aren't obligated to pay for injuries sustained on the job. If your insurance company discovers you've made claims for work-related injuries, it will stop paying for your appointments and treatment—and will likely demand that you reimburse it for any amount it paid to treat your work injury. Although the coverage is provided "through" the employer, the insurance company will demand payment from you, not from the company you work for.
On the other hand, if you file for workers' compensation, all of these appointments will be covered, and you'll likely owe no co-payments for your treatment.
Employers suddenly refuse to pay out-of-pocket when treatment gets expensive.
An employer that promises to pay medical expenses for an employee may suddenly forget that promise when the employee needs surgery, diagnostic studies, or other costly treatments. In some cases, employers encourage (or require) employees to tell a doctor or hospital that the injury didn't occur at work, undermining the employee's credibility if he or she later files a workers' compensation case.
Employees are unable to pay bills because they're unable to work.
Unlike employer-sponsored health insurance, workers' compensation also pays a portion of lost wages to employees while they're off work treating their injuries. These payments allow an injured worker's income to continue throughout recovery.
The bottom line is this: if you're injured on the job, don't hesitate to file a workers' compensation claim. For more information, order our free book, It’s Not Rocket Science, It’s Workers’ Comp, or fill out the quick contact form on this page today to schedule your no-cost consultation with our work injury attorneys.