Reiki

Definition - What does Reiki mean?

Reiki is a spiritual healing technique in which the practitioner directs energy into the patient through touch. The energy is believed to restore the patient's physical and psychological health.

However, clinical research has not demonstrated the effectiveness of Reiki, and numerous health research organizations have advised against replacing conventional therapy with it.

WorkplaceTesting explains Reiki

Reiki combines two Japanese words, rei and ki. The word rei can be roughly translated as universal life and ki as energy. The practice was developed in Japan by Mikao Usui in 1922. According to Reiki practitioners, all living beings possess strong, free-flowing energy. When this energy becomes blocked or depleted, the physical, psychological, and emotional health of the individual is disturbed. By helping restore the level and flow of this energy, Reiki practitioners strive to reestablish well-being.

Many leading research institutions strongly advise against replacing conventional therapy with Reiki because its healing effects have not been clinically proven. It is more a spiritual art than an exact science.

This definition was written in the context of Alternative Medicine