Can Colleges and Employers Legally Require You to Get Vaccinated? It’s Complicated.

Some colleges and employers are mandating COVID-19 vaccines, while some states are proposing laws to prohibit vaccine mandates — leaving the unvaccinated to wonder where they stand.