NLP Responses
Some NLP engines, such as Dialogflow, can resolve an intent themselves by sending back a string answer. For example, if a user types "Hi" on the FrontM platform, the platform can, by default, hand control to the NLP engine, which will reply along the lines of "Hello, how are you?".
Developers don't need to script anything (for example, in the relevant Intent object's onMatching event) for this to happen; the frontm.js platform simply fetches the answer from the NLP engine and sends it to the user.
However, by creating a script containing intents to be matched, developers can answer a potentially vast array of specific user queries, placing much less emphasis on the NLP engine.
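For illustration, a scripted intent might look like the following sketch. The exact arguments passed to onMatching are an assumption here (this article does not define its signature); the Intent and state objects follow the examples later in this article.

let greeting = new Intent('greeting');

// Assumption: onMatching receives the raw user message and returns
// true when this intent should handle it. The signature is
// illustrative only.
greeting.onMatching = async (message) => {
    return /^(hi|hello|hey)\b/i.test(message);
};

greeting.onResolution = async () => {
    // Answered by the script itself, without involving the NLP engine.
    state.addStringResponse('Hello, how are you?');
};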
Scripted Responses
As discussed, frontm.js apps respond to a user when their request first matches an intent scripted by the developer. If no scripted intent matches, the platform hands over to the NLP engine. If there is still no match, the app responds with a message saying that it doesn't understand the user's request.
Once an intent is matched by frontm.js, developers resolve it by adding appropriate responses. Responses can be of the following types:
• Text
• Forms
• Lists
• Maps
• Buttons
• Cards
• Smart Suggestions
• Standard Notifications
• Critical Notifications
• Stripe Credit Card Payment Requests
• Charts
• Prompts
• Search Boxes
• Sliders
• Pictures
• Videos
In practice, developers provide responses by implementing the relevant Intent object's onResolution event. They can, for example, call the state object's addResponse(type, object) method which, in turn, adds to the state object's responsesArray property (where all the responses are held), or use a shorthand such as the addStringResponse method shown below:
q29.onResolution = async () => {
    state.addStringResponse('OK, I started tracking your vessels');
};
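The richer types in the list above go through addResponse(type, object). The type string and payload shape in this sketch are hypothetical, shown only to illustrate the call; the real schema for each response type is not covered in this article.

q29.onResolution = async () => {
    // 'buttons' and the payload fields are hypothetical placeholders;
    // they illustrate the addResponse(type, object) pattern only.
    state.addResponse('buttons', {
        title: 'Track more vessels?',
        options: ['Yes', 'No'],
    });
};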
If an intent is matched and the onResolution event executes leaving no entries in the state object's responsesArray property and no errors in the state object's errorStack property, frontm.js will respond saying that it doesn't understand. (This indicates a mistake in the app's logic and is an unlikely scenario.)
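For concreteness, that buggy case would look like an onResolution that returns without touching the state object:

q29.onResolution = async () => {
    // Nothing is added to state.responsesArray and nothing is pushed
    // to state.errorStack, so frontm.js falls back to its default
    // "I don't understand" reply.
};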
Silent Responses
Under certain circumstances, the developer might want the onResolution event to run silently. This usually happens when the app needs to perform asynchronous processing: the intent of the user's message has been matched and processed, but there is no immediate response. In such a scenario, the Intent Execution Hierarchy starts by processing an intent at the edge and ends with an intent in the cloud: the intent matches on the edge and runs silently, before passing execution to another intent that runs on the cloud and provides the final answer.
There are two ways of giving a silent response. The first is to add a silent response from within onResolution, typically under a condition:
q29.onResolution = async () => {
    // silentAnswerRequired stands for any condition the developer
    // defines elsewhere in the script.
    if (silentAnswerRequired) {
        state.addSilentResponse();
    } else {
        state.addStringResponse('Making some noise');
    }
};
Or to make the intent completely silent:
let myIntent = new Intent('myIntent');
myIntent.silenceIntent();
myIntent.onResolution = async () => {
    state.addStringResponse('This response will not show up');
};
In the first option, the intent itself is not marked as silent; a silent response is only given under the conditions specified by the developer.
In the second option, the intent is marked as silent when it is created. Therefore, any responses it creates will be ignored, i.e. not shown to the user.
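Putting this together, the edge-to-cloud pattern described at the start of this section might be sketched as follows. How frontm.js routes execution from the edge intent to the cloud intent is platform-managed and not detailed in this article, so it is only indicated in the comments:

// On the edge: the intent is fully silenced, so nothing it adds
// will be shown to the user.
let trackVessels = new Intent('trackVessels');
trackVessels.silenceIntent();
trackVessels.onResolution = async () => {
    // Start the asynchronous work here; frontm.js then passes
    // execution to the matching intent running in the cloud.
};

// In the cloud: a separate script resolves the same request and
// provides the final, user-visible answer.
let trackVesselsCloud = new Intent('trackVessels');
trackVesselsCloud.onResolution = async () => {
    state.addStringResponse('OK, I started tracking your vessels');
};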