In Florida, nearly all employers are required to carry workers’ compensation insurance: in general, non-construction businesses with four or more employees and construction businesses with at least one employee must have coverage. If you are injured at work, you are entitled to workers’ compensation benefits to offset lost wages and medical expenses. Under Florida’s Workers’ Compensation Law, these benefits are provided regardless of who is at fault for the injury.
Below are answers to the questions we hear most often. Contact us today for advice on your case.