Women Scientists in America’s History
March 23, 2017
During Women’s History Month in March, it is particularly fitting to recognize women’s vital role in, and contributions to, our nation’s health throughout American history. American women are scientists, doctors, nurses, teachers, and researchers, and they are indispensable to strong and healthy families and communities. They are pushing the boundaries of science to cure diseases.
READ MORE: Women Scientists in America’s History