What does State Farm do?

State Farm is a group of mutual insurance and financial services companies in the United States. The group's flagship is State Farm Mutual Automobile Insurance Company, a mutual insurance company that also owns the other State Farm subsidiaries.