Definition - What does Worker's Compensation mean?
Worker's compensation (or workers' compensation) is a type of government-mandated insurance that employers provide for their employees. This insurance is designed to provide financial compensation if a worker suffers a work-related injury or disability that results in time away from work. The compensation from this type of insurance is designed to make up for lost wages and help cover medical expenses.
WorkplaceTesting explains Worker's Compensation
Worker's compensation insurance is meant to be a safety net for workers injured on the job. Typically, in order to receive worker's compensation benefits, the employee must agree not to sue the employer over the injury or accident. Worker's compensation also covers occupational illnesses that can be proven to be a direct result of the employment, such as a lab technician contracting HIV from an injury caused by an infected needle. Worker's compensation provides a variety of benefits, including wage replacement, medical treatment, and vocational rehabilitation.