Science

Do Americans Think the Country Is Losing or Gaining Ground in Science?

January 15, 2026 | Reem Nadeem

Republicans and Democrats agree that it is important for the U.S. to be a world leader in science, but they diverge sharply on how the country is faring.