• RadioFreeArabia@lemmy.cafe · 1 month ago

No, the US has been an empire from the start. Unless you don’t count conquering and colonizing the indigenous peoples because they weren’t “civilized” or something.