U.S. Army (Noun)
Meaning
The army of the United States of America; the agency that organizes and trains soldiers for land warfare.
Classification
Nouns denoting groupings of people or objects.