“Women play a key role in healthcare”

Women play a key role in healthcare. Globally, the number of women working in the healthcare sector is increasing. The sector is a good place for women to work, and they constitute a major part of its workforce.

We feel women can play an important role in the healthcare sector. We believe women are, on the whole, more compassionate in their approach towards patients. The number of women doctors in the industry is also on the rise, and they are overtaking men. Nursing has long been a women's forte. The belief is that women are compassionate and concerned, and are able to connect with patients more than men can.

The increasing number of women in healthcare is a positive trend from the viewpoint of patient care. We also now see women occupying key positions in the healthcare sector, which is encouraging.

Dr B S Ajaikumar, Chairman, HealthCare Global, Bangalore
