Institute of American Geography

The Institute of American Geography is an organization dedicated to the study and advancement of geography in the United States. It promotes research, scholarship, and education aimed at understanding the spatial relationships and patterns that shape the environment, society, and the economy. The institute organizes conferences, publishes scholarly work, and supports geographic professionals. Its goal is to foster a deeper understanding of geographic issues in order to inform policy and planning, ensuring that geographic knowledge benefits communities and helps address societal challenges.