Use BERT: state-of-the-art language models in the BERT family are available in different sizes. BERT is pre-trained on masked-word prediction and next-sentence prediction (deciding whether one sentence follows another), which makes it a good base for question answering and chatbots: feed in a question and a fine-tuned model can pick out the answer; feed in a statement from person A and it can select or score an appropriate response from person B.
Typically, people take a standard pre-trained BERT model and fine-tune it on text that is similar to the desired output. This is usually done by adding one or more task-specific layers on top of the pre-trained encoder and training them (and, optionally, the encoder weights as well) with backpropagation.
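To make the fine-tuning pattern concrete, here is a toy sketch of the idea: freeze a pre-trained "encoder", bolt a small classification head on top, and train only the head with backpropagation. The encoder here is just a fixed random projection standing in for BERT, and all names, dimensions, and data are illustrative, not real BERT code.

```python
# Toy sketch of fine-tuning: frozen "encoder" + trainable head.
# The head mimics BERT's next-sentence-prediction classifier.
import math
import random

random.seed(0)

EMB_DIM = 8  # stand-in for BERT's hidden size (768 in the real model)

# Frozen "encoder": a fixed random embedding per input id (stands in for
# running a sentence pair through pre-trained BERT).
encoder_weights = [[random.uniform(-1, 1) for _ in range(EMB_DIM)]
                   for _ in range(2)]

def encode(token: int) -> list:
    """Return the frozen encoder output for a toy input."""
    return encoder_weights[token]

# Trainable head: logistic regression over the encoder output.
head_w = [0.0] * EMB_DIM
head_b = 0.0

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def predict(token: int) -> float:
    h = encode(token)
    return sigmoid(sum(w * x for w, x in zip(head_w, h)) + head_b)

# Toy data: input 0 -> "sentence B follows A" (label 1), input 1 -> not (0).
data = [(0, 1.0), (1, 0.0)]

lr = 0.5
for _ in range(200):              # gradient descent on the head only;
    for token, label in data:     # the encoder stays frozen throughout
        h = encode(token)
        grad = predict(token) - label  # d(cross-entropy)/d(logit)
        for i in range(EMB_DIM):
            head_w[i] -= lr * grad * h[i]
        head_b -= lr * grad

print(predict(0), predict(1))  # head now separates the two classes
```

Real fine-tuning works the same way in spirit, just with BERT's transformer layers as the encoder and a library such as Hugging Face Transformers or plain PyTorch handling the gradients.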
Here are a couple of useful links: Simple Chatbot using BERT and Pytorch: Part 1, and Extending Google-BERT as Question and Answering model and Chatbot.
David Gamez and I came up with this idea, so he may also be interested in supervising a thesis around these ideas.