I don't know if it's just me (I hope not), but it seems that romance is no longer alive. I think it's such a shame, because every girl needs to experience it. In my opinion, people are now so used to living without it that they don't even acknowledge the concept of romance and just let it pass them by. This needs to change! Maybe it's different in the US, but the guys in England don't understand what the hell romance is, at all. We need it!
What do you guys think?