Dispelling the Rumors of Nursing Degrees

Posted December 9th, 2009 by Site Administrator in Uncategorized

Nursing jobs have become somewhat glamorized by Hollywood thanks to the recent surge of hospital shows, including ER, Grey's Anatomy, and Scrubs.  While the surgeons take up much of the screen time on Grey's Anatomy, nurses are still a huge part of the show, and some of the main characters of ER and Scrubs are the head nurses of their respective hospitals.  Nursing degrees do ultimately lead to the ability to save lives and help rehabilitate patients, but the work is typically not as glamorous as these shows make it appear.  Nursing degrees require students to be fully committed to the health industry and fully committed to their patients at the same time.

Nursing degrees can now be attained at a number of online institutions, making the RN requirement that much easier to earn, although many students are disappointed by their eventual career placement.  Working alongside attractive surgeons is not what should propel students to become nurses, and the fact that these TV shows cater to false notions about the health industry is not helping health education.  A true nursing career requires hours of dedication to patients who need constant care and attention; there is little time for the social hours that so many of these shows suggest health care professionals have.

Nursing degrees additionally require years of hard work and coursework, and programs are designed to weed out students who think they can breeze through a health-care profession.  Such students are left struggling when their first patient has a seizure or cannot stop bleeding.  Still, despite falling short of the constant parties and social gatherings depicted on TV, nursing careers are incredibly rewarding and uplifting.  Every day, nurses have the opportunity to connect with patients on a personal level, and they are the ones in charge of monitoring patients' vital signs after surgery and performing examinations.

Drawing the line between fact and fiction is hard for any highly popularized career, but the health care industry in particular has seen a rise in interest among the American public as a result of the many medical shows that have flooded the airwaves in recent years.  While some of these shows occasionally depict what hospitals actually go through, the vast majority only glamorize a career that is meant to give back to the community and save lives.  Nursing degrees are a major part of this community, and bringing a dose of reality to students is a necessary part of their schooling.
