What the World Series Means for the Sports World

Another championship has come to a glorious end – another champion crowned, another wild series and another winter to prepare for next year. 

As this article is being formulated by my genius in my darkened dorm room in Prescott at 20:50 hours, the Tampa Bay Rays are leading the Los Angeles Dodgers 1-0 in the top of the 5th inning. The Dodgers, however, hold a 3-2 series lead. By the time you read this, it might be ancient history that either the Dodgers won their 7th World Series title or that the Tampa Bay Rays are fresh off their first-ever championship.

Whatever the outcome may be, sports are truly wonderful. Let me remind you: football may have “America’s Team,” but America’s game is baseball. I know I might get some hate for admitting that I never watch baseball on TV except when the World Series is on, yet I comment and talk about it as if I had hit a grand slam myself. Still, I can acknowledge that baseball is the best.

Baseball has been an integral part of American culture ever since its humble professional beginnings in 1869. The game has been shaped by notable superstars such as Babe Ruth, Jackie Robinson, Willie Mays and Barry Bonds. Throughout the years, there have been outright dominant teams, such as the New York Yankees, and teams fresh off their first World Series title, like the beautiful Washington Nationals.

What does this have to do with anything? Can’t you see? Sports are such an important part of American culture, and COVID-19 nearly cancelled them all. So celebrate with the nation and watch some sports, not for the politics and social movements (you can watch the news for that), but to show your appreciation for the game and to share your team pride.

Joel Shetler is a junior Science Ed major from Ruckersville, Virginia.