Unseen Pictures of the USA


The American colonists won the Revolutionary War and founded a new country. The Constitution was signed in 1787, and the Bill of Rights was ratified in 1791. George Washington, who had commanded the army during the war, became the nation's first president. During the 19th century, the United States gained much more land in the West and began to industrialize.
