Currently, much of the US population carries private health insurance. Many of these plans cover only "necessary treatments." After learning about the determinants of health, do you think there should be legislation requiring insurance plans to cover health promotion and disease prevention (e.g., smoking-cessation or weight-loss programs intended to prevent future illness)? In other words, should more of the financial focus be on prevention or on treatment?