Is it worth it?
Is it worth the stress, the danger, the risk to one's own life to come to the land of the free?
Is it worth it for "minoritiies" to feel like they are less than everyone else?
Is it worth it for individuals to feel like they will always be thought less of?
To me,
Yes.
To me,
Coming to the United States means the land of opportunity
The land of rising
The chance to see your children get an education
The opportunity to have a home and feel secure
How many times do we hear "They're taking jobs"
They're rapists
They're murderers
They don't belong here
It's painful to see others thinking this way
We all belong here
We all have a purpose
It is worth it