How do you teach a machine to send text messages like a human?
This was the challenge RunGopher brought to us back in 2018, when they wanted to scale their automated messaging platform for customized SMS marketing campaigns. Their differentiator is the personalized message sent to each user: messages that read as if written by a human rather than sounding machine-generated, and that successfully draw responses from recipients. RunGopher already had an effective rule-based system for continuing the conversation with each individual who responded to the initial message, but they wanted to improve the message flow and make the core platform more robust and scalable. We recognized this as an ideal use case for the cutting-edge natural language processing (NLP) techniques we proposed and subsequently built for RunGopher.
Scaling the conversational model with deep learning
With AI, it is possible to derive sensible insights even from vague responses. Knowing this called for an NLP solution, we immediately set out to find the best technique for this specific use case: which models and tools could make the engagements more natural, and what impact the solution would ultimately have on the user experience.
While doing our research, we came across Bidirectional Encoder Representations from Transformers (BERT) by Google, a deep-learning technique for training models on a range of downstream NLP tasks, such as sentence classification and sentiment analysis. Newly released at the time, BERT was the state of the art, performing significantly better than other widely used approaches. It also overcame a classic weakness of the models that existed at the time: the inability to distinguish between different usages of the same word. For example, BERT can accurately interpret the difference between the “bank” where one deposits money and the “bank” of a river, based on sentence structure and context. Since discerning the intended sense of a word was critical here, and the AI solution had to understand user responses and carry on conversations just as a human would, BERT was a strong candidate for RunGopher.
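To make the “bank” example concrete, here is a toy sketch of why contextual embeddings matter. The vectors below are invented for illustration (real BERT vectors have hundreds of dimensions); the point is only that a static embedding gives one vector per word, while a contextual model gives a different vector per occurrence:

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = lambda x: math.sqrt(sum(a * a for a in x))
    return dot / (norm(u) * norm(v))

# A static embedding (word2vec/GloVe style) assigns "bank" ONE vector,
# identical in every sentence, so the two senses cannot be separated.
static_bank = [0.7, 0.2, 0.1]

# A contextual model like BERT emits a DIFFERENT vector for each occurrence
# of "bank", shaped by the surrounding words. (These numbers are made up.)
bank_financial = [0.9, 0.1, 0.0]   # "...deposit money at the bank"
bank_river     = [0.1, 0.2, 0.9]   # "...sat on the bank of the river"

print(round(cosine(static_bank, static_bank), 2))   # 1.0: static can't tell the senses apart
print(round(cosine(bank_financial, bank_river), 2)) # low: contextual keeps them far apart
```

The low similarity between the two contextual vectors is what lets a downstream classifier treat the two senses of “bank” differently.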
Accurate recognition of user intent with NLP
When working on this conversational model, we had to identify the challenges inherent in human language and decide how to handle them. For example, a language model may struggle to understand sarcasm or satire, and in such cases we had to take a pragmatic approach.
In late 2018, when we were designing this solution, BERT had only just been released and frameworks like Hugging Face Transformers had not yet gained popularity. As a result, we had to extend BERT ourselves for tasks such as online inference and model chaining. With these extensions and custom models, we successfully deployed BERT for RunGopher’s use case; the new system performed far more robustly than the existing rule-based approach, achieving over 90% accuracy in recognizing user intent from messages. It also enabled analysis of user responses to draw valuable business insights for further optimizing the RunGopher platform.
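The intent-recognition flow can be pictured with a deliberately simplified stand-in. The real system fine-tuned BERT and served it online; the sketch below instead uses a bag-of-words nearest-centroid rule, and the intent labels and example SMS replies are invented. Only the classify-a-reply-into-an-intent shape of the problem is the point:

```python
# Simplified stand-in for a BERT intent classifier: represent each reply as a
# bag of words, build one centroid per intent from a few labelled examples,
# and assign a new reply to the intent with the most similar centroid.
from collections import Counter
import math

TRAINING = {  # hypothetical intents and replies, for illustration only
    "interested": ["yes please tell me more", "sounds good sign me up", "sure i am keen"],
    "not_interested": ["no thanks", "not interested please stop", "remove me from this list"],
    "question": ["how much does it cost", "when does the offer end", "what is this about"],
}

def vectorize(text):
    """Bag-of-words representation: word -> count."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# One centroid per intent: the summed word counts of its example replies.
centroids = {
    intent: sum((vectorize(t) for t in examples), Counter())
    for intent, examples in TRAINING.items()
}

def predict_intent(reply):
    return max(centroids, key=lambda intent: cosine(vectorize(reply), centroids[intent]))

print(predict_intent("yes tell me more please"))  # interested
```

A fine-tuned BERT classifier replaces the bag-of-words vectors with contextual representations, which is what made the production system robust to the varied, informal phrasing of real SMS replies.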
With the help of the ConscientAI team, RunGopher possibly became one of the earliest companies to take a state-of-the-art technique like BERT beyond experimentation and into a production system. This showcases our ability to adapt quickly and embrace new developments in AI, a rapidly progressing field, to deliver successful business solutions to our clients.
If you are looking to improve your business processes with similar AI solutions, reach out to us. We would love to help you!