Florida plans to become the first state to eliminate vaccine mandates, which for decades have been a cornerstone of public health policy protecting schoolchildren and adults from infectious diseases.