Long story short: is it just me, or does the United States of America seem bland and boring nowadays compared to other countries?
You see, there's no big news, no scandals, Kobe's out... Show business is dull, with over-made-up, unreal, fictional movie series all over a dry garden of nothing. "Exciting" television, my ass. Man, it's totally boring! I can't imagine life without America on the center stage. So I'm just wondering, and I beg to ask the following: