Having spent the past week in the US, I have had the baseball "World" Series thrust upon me from all angles. I sat through a couple of games and attempted to get enthusiastic, but I'm afraid I ended up thinking it was a load of over-hyped American bollocks!
Does everyone agree that American sports, and in particular baseball, are shite? (If anyone cares, the Florida Marlins beat the New York Yankees 4-2 to win the World Series.)