Must You Provide a 401(k)?
Given how ubiquitous 401(k) plans are these days, you might assume they are practically mandated by law. In fact, employers in the United States have no legal obligation to offer a retirement savings benefit to employees, though many choose to do so now that traditional employee pensions have all but disappeared.