Health care jobs are expanding! Is that good or bad for the country?

Because it is presidential election season, we have heard, and will continue to hear, plenty of discussion about the American economy and the state of employment. Policymakers and politicians love to show how many jobs they have created. These same politicians also want to decrease health care costs, or at least have us believe that they are trying hard to do so. But the main reason government officials have not managed to decrease health care costs in the United States, aside from worsening political polarization, is embodied in the most recent government jobs report: the health care sector creates more jobs than any other industry.

Read the rest of my latest piece at the Huffington Post:
goo.gl/Bt84JZ