Sign language recognition and understanding are challenging for people unfamiliar with signing, which limits communication between deaf-mute people and others. The system presented in this paper lowers this communication barrier by introducing an automatic translation layer that facilitates sign language understanding. It uses a deep-learning model for sign language detection and a separate library for hand-joint mapping. The application's architecture was designed so that users can access the system from both desktop and mobile devices. The model initially achieved 82% accuracy, and after several adjustments to the activation function we achieved perfect classification in our real-world tests. The system offered excellent accuracy, and its usability lowers the communication barrier between people while providing flexibility, as the application is available on any device with a browser.