Machine translation uses Natural Language Processing (NLP) to
automatically translate text across languages, and business globalization and the
internet have increased its adoption. Machine translation is useful for quickly
understanding foreign-language content, but it is not always accurate or reliable,
particularly for complex or idiomatic language. This research presents a neural
machine translation approach based on the sequence-to-sequence (Seq2Seq)
architecture using Uni-LSTM and Bi-LSTM with and without attention mechanisms
for translating English sentences into Hindi sentences. We investigated several
approaches to constructing machine translation models, including the Seq2Seq
architecture and attention mechanisms. We trained the model on a large parallel corpus of
English-to-Hindi sentence pairs and evaluated it on a separate test set. The efficacy of
our approach was demonstrated by the BLEU score of 14.76 achieved by the Bi-LSTM
with attention mechanism, which outperformed the Uni-LSTM in translating English
sentences into Hindi. Our research aims to achieve a high level of machine translation
performance on the test set. Our
results suggest that the proposed Seq2Seq model with attention mechanisms is a
promising approach for English-to-Hindi machine translation.
Keywords: Bi-LSTM, NLP, Neural machine translation, Recurrent neural network, Uni-LSTM.
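
To make the described architecture concrete, the following is a minimal illustrative sketch of a Bi-LSTM encoder-decoder with a dot-product attention layer in Keras. The vocabulary sizes, embedding and hidden dimensions, and layer names are placeholder assumptions, not the exact configuration trained in this work.

```python
# Minimal sketch of a Seq2Seq model with a Bi-LSTM encoder, Uni-LSTM decoder,
# and Luong-style (dot-product) attention. Hyperparameters are assumptions.
import tensorflow as tf
from tensorflow.keras import layers, Model

SRC_VOCAB, TGT_VOCAB = 30_000, 30_000   # assumed English / Hindi vocabulary sizes
EMB_DIM, HID_DIM = 256, 512             # assumed embedding / LSTM sizes

# Encoder: embed English tokens and run a bidirectional LSTM.
enc_in = layers.Input(shape=(None,), name="english_tokens")
enc_emb = layers.Embedding(SRC_VOCAB, EMB_DIM, mask_zero=True)(enc_in)
enc_out, fh, fc, bh, bc = layers.Bidirectional(
    layers.LSTM(HID_DIM, return_sequences=True, return_state=True)
)(enc_emb)
# Concatenate forward/backward states to initialise the decoder.
state_h = layers.Concatenate()([fh, bh])
state_c = layers.Concatenate()([fc, bc])

# Decoder: unidirectional LSTM over shifted Hindi tokens (teacher forcing).
dec_in = layers.Input(shape=(None,), name="hindi_tokens_shifted")
dec_emb = layers.Embedding(TGT_VOCAB, EMB_DIM, mask_zero=True)(dec_in)
dec_out = layers.LSTM(2 * HID_DIM, return_sequences=True)(
    dec_emb, initial_state=[state_h, state_c]
)

# Dot-product attention of decoder states over encoder outputs.
context = layers.Attention()([dec_out, enc_out])
dec_cat = layers.Concatenate()([dec_out, context])
logits = layers.Dense(TGT_VOCAB, activation="softmax")(dec_cat)

model = Model([enc_in, dec_in], logits)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```

Dropping the attention layer and feeding `dec_cat = dec_out` directly into the output projection yields the plain Seq2Seq baseline, while replacing the bidirectional encoder with a single `layers.LSTM` gives the Uni-LSTM variants compared above.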