colonialism
The word colony comes from the Latin word colonus, meaning farmer, indicating the transfer of people to land. Colonialism is the exercise of power and domination by one nation over another: acquiring or maintaining full or partial political control over another sovereign nation. The country or nation that comes under the control of a foreign nation is known as a colony of the dominating country. While the two are related, colonialism should not be confused with imperialism, which involves the outward use of military and economic power and always aims for further expansion and collective domination.
To learn more about colonialism, see this Florida A&M University Law Review article.
See also: terra nullius, doctrine of discovery, settler colonialism
[Last reviewed in April of 2022 by the Wex Definitions Team]