Sign languages (SLs) are an essential form of communication for hearing-impaired people. However, a communication barrier still exists between the deaf community and the hearing population due to the lack of accurate automated SL communication systems. In this work, a novel SL communication system running as a mobile application is developed to facilitate bidirectional communication between hearing-impaired and hearing people. The proposed system utilizes Natural Language Processing (NLP) techniques, along with linguistic rules, to convert between spoken language and signs, taking into account the grammatical structure of the sign language. Additionally, the system employs sign language recognition (SLR) algorithms to transform video sequences into signs, as well as hand and pose estimation algorithms to model the 3D motion of signs. Moreover, a 3D human avatar representation is employed to animate the motion of each sign in a seamless manner. Finally, a new in-the-wild partition of the Greek SL (GSL) dataset, GSLW, is formed with 1825 videos from 12 signers to evaluate SLR performance under realistic conditions. The proposed SL communication system and its components are validated quantitatively on the GSLW dataset, as well as qualitatively by means of questionnaires demonstrating user satisfaction with the system.