In today’s business environment, efficient case management and proper categorization of incoming emails are key to providing excellent customer service. In this article, we will explore how to leverage Azure OpenAI Service to automate case classification within the CRM Customer Service module, thereby enhancing productivity and efficiency in handling incidents, enhancements, and inquiries.

The objective is to review Azure OpenAI, utilize one of its models “gpt-3.5-turbo,” and integrate Dataverse with the model to create categorized cases from an email. All of this will be done automatically, and the request to the model will be made from a plugin within Dynamics 365.

Introduction to Azure OpenAI

Firstly, it is important to understand what Azure OpenAI Service is and what it offers. It is a service hosted on Azure that provides access to the most advanced OpenAI models, such as GPT-4, GPT-3, Codex, and DALL-E. Being hosted on the Microsoft cloud, the service benefits from the enterprise-grade security and reliability of Azure. In essence, it offers customers the same OpenAI models, but with the security and scalability capabilities of Azure.

Access to Azure OpenAI Service is currently limited. To use it, you need to request access by filling out the Azure OpenAI Service request form (access is not immediate).
Once access is granted, you create an “Azure OpenAI” resource from the Marketplace in your Azure account.


To create the resource, you need to fill in the following parameters:

  • Name: MyOpenAIResource
  • Resource group: OAIResourceGroup
  • Region: eastus
  • Subscription: subscriptionID

Within the resource, you can access Azure OpenAI Studio, which is a simple and user-friendly interface where you can work with the different models.


There are many models, which can be divided by family and capability. The main ones are summarized below; in this article, we will work with GPT-3, specifically the “gpt-35-turbo” model.

The model families, their descriptions, and their models are:

  • GPT-4: models that generate natural language and code (currently in preview). Models: GPT-4, GPT-4-32K.
  • GPT-3: models that can understand and generate natural language. Models: text-davinci-003, text-curie-001, text-babbage-001, text-ada-001, gpt-35-turbo.
  • Codex: models that can understand and generate code, including translating natural language to code. Models: code-davinci-002, code-cushman-001.

It is essential to understand the concept of tokens, as OpenAI models operate on tokens and the cost of using these models depends on them. Tokens are the smallest units into which the model splits text for interpretation; they do not necessarily correspond to syllables, and for short words a single token can be the entire word.

Each model has different quotas and billing rates, and billing is always calculated per 1,000 tokens. Additionally, the models have a token limit per request. In the case of gpt-3.5-turbo, the maximum is 4,096 tokens, including both the request and the model’s response. For example, if the prompt (system message, examples, and user message) consumes 1,000 tokens, at most 3,096 tokens remain available for the response.

Model creation

Once inside Azure OpenAI Studio, select the “Deployments” page under “Administration”. From there, you can choose the model you want to deploy and give it a suitable name.


When the model is successfully deployed, you can access its properties.


The “Open in Playground” button takes you to the “Chat” and “Completions” pages, where you can interact with the model. There are default templates that define the model’s behavior, although they can be modified; you can also start from a blank template and configure the model from scratch.

To configure and define the behavior of the model, you can adjust a series of parameters (a short request sketch follows the list):

  • Temperature: Controls randomness. Lowering the temperature means the model will generate more repetitive and deterministic responses. Increasing the temperature will result in more unexpected or creative responses. Try adjusting the temperature or top P, but not both.
  • Top P: Similar to temperature, it controls randomness but uses a different method. Lowering the top P narrows the model’s token selection to the most likely tokens. Increasing the top P lets the model choose from tokens with both high and low probability. Try adjusting the temperature or top P, but not both.
  • Maximum Length (tokens): Set a limit on the number of tokens per model response. The API supports a maximum of 4000 tokens shared between the system prompt (including system message, examples, message history, and user query) and the model’s response. A token is approximately 4 characters for typical English text.
  • Stop Sequences: Make responses stop at a desired point, such as the end of a sentence or a list. Specify up to four sequences at which the model will stop generating further tokens in a response. The returned text will not contain the stop sequence.
  • Frequency Penalty: Reduce the likelihood of repeating a token proportionally based on its frequency in the text so far. This reduces the probability of exact text repetition in a response.
  • Presence Penalty: Reduce the likelihood of repeating any token that has appeared in the text so far. This increases the probability of introducing new topics in a response.
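
These parameters map directly onto fields of the request body when the model is called from code. The following is a minimal, illustrative sketch in C#; the values shown and the use of Newtonsoft.Json are assumptions for the example, not settings taken from this article:

```csharp
// Illustrative only: how the Playground parameters map onto the body of a
// Chat Completions request. The serializer (Newtonsoft.Json) is an assumption.
var requestBody = new
{
    messages = new[]
    {
        new { role = "user", content = "Hello" }
    },
    temperature = 0.2,          // lower -> more repetitive/deterministic responses
    // top_p = 0.95,            // alternative to temperature; adjust one or the other, not both
    max_tokens = 150,           // Maximum Length: cap on the tokens generated in the response
    stop = new[] { "\n\n" },    // Stop Sequences: up to four sequences
    frequency_penalty = 0.0,    // penalizes tokens by how often they already appeared
    presence_penalty = 0.0      // penalizes any token that has already appeared
};

string json = Newtonsoft.Json.JsonConvert.SerializeObject(requestBody);
```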
In the Chat playground, you can also define how the assistant should behave based on user questions, using the concept of roles. The following roles are available:

  • System -> Defines the behavior of the assistant.
  • User -> Adds an example of a possible user interaction.
  • Assistant -> Adds the expected response to the user’s message.

Finally, after defining the parameters and roles of the gpt-3.5-turbo model, we can perform tests within Azure OpenAI Studio itself.

Model implementation for case categorization

In this example, we will define a model whose function is to classify emails received from different users and categorize them as enhancement, question, or incident. To do this, within the implementation of the previously created model (gpt-35-turbo), we need to define both its parameters and its behavior based on roles.

Configuration of Roles:

System:

“You are an artificial intelligence responsible for categorizing cases based on email messages received from an app. I only need the response for the category. There are 3 types: Question, Incident, and Enhancement. Question: It is categorized as a Question when a user doesn’t know how something works or has doubts about the steps to follow for a specific action. Incident: It is categorized as an Incident when the user reports a problem with their order or the received service. It also applies when the user indicates that they received something in poor condition. Enhancement: It is categorized as an Enhancement when a user expresses the need to implement a new requirement.”

Example 1:
User:
“Good afternoon, I’m having a hard time with the uploading process. I would like to request a solution to relieve me from this task.”
Assistant:
“Enhancement”

Example 2:
User:
“I can’t purchase the product, the button doesn’t appear. Regards.”
Assistant:
“Question”

The more specific the system function is and the more examples are used, the higher the probability of accurate categorization.
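
Put together, this role configuration translates one-to-one into the messages collection of the request: the system prompt goes first, the example user/assistant pairs follow, and the real email body is appended as the last user message. A hedged sketch (the system text is abbreviated here and emailBody is an assumed variable):

```csharp
// Sketch: the Playground roles become entries of the "messages" array.
string emailBody = "..."; // the text of the email to categorize (assumed variable)

var messages = new[]
{
    // System: defines the assistant's behavior (full text shortened here).
    new { role = "system",    content = "You are an artificial intelligence responsible for categorizing cases... There are 3 types: Question, Incident, and Enhancement." },

    // Few-shot example 1
    new { role = "user",      content = "Good afternoon, I'm having a hard time with the uploading process. I would like to request a solution to relieve me from this task." },
    new { role = "assistant", content = "Enhancement" },

    // Few-shot example 2
    new { role = "user",      content = "I can't purchase the product, the button doesn't appear. Regards." },
    new { role = "assistant", content = "Question" },

    // The actual message to classify always goes last.
    new { role = "user",      content = emailBody }
};
```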

With the model already configured, we can perform tests from the “Chat Session” to verify that the functionality is appropriate.


The assistant responds with the expected category.


From the “View code” button, Azure OpenAI Studio lets you view sample code for calling the model in several formats and languages:

    • JSON
    • Python
    • C#
    • curl

In addition to the code, it provides the endpoint for the call, as well as the key required to authenticate against the service.


Dynamics 365 plugin integration

In a Dataverse environment, specifically in Customer Service, a scenario has been set up to categorize cases based on emails sent by the customer.

The process works as follows: when an email is received, a plugin registered on the Create message of the Email entity is triggered. This plugin performs a series of checks and constructs the request using the code provided by Azure OpenAI Studio; the body of the received email is added to the object sent in the request so that the message can be analyzed and categorized. The code provided by the Studio is quite complete, but it requires adaptations to fit the specific requirements.

In this example, an “HttpRequest” call has been used to invoke the service.
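
The original plugin code is shown as a screenshot in the article; the following is a minimal, hypothetical sketch of the same idea, registered on the Create message of the email entity. The class and method names, the endpoint, the key, the deployment name, and the use of Newtonsoft.Json are placeholders and assumptions, not the author’s actual implementation:

```csharp
using System;
using System.Net.Http;
using System.Text;
using Microsoft.Xrm.Sdk;
using Newtonsoft.Json.Linq;

// Hypothetical sketch of a plugin registered on Create of the "email" entity.
public class CategorizeEmailPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        if (context.MessageName != "Create" ||
            !(context.InputParameters["Target"] is Entity email) ||
            email.LogicalName != "email")
            return;

        // The body of the received email is what gets sent to the model.
        string body = email.GetAttributeValue<string>("description");
        if (string.IsNullOrWhiteSpace(body))
            return;

        string category = GetCategoryFromAzureOpenAI(body); // e.g. "Incident"
        // ...the case is then created from this category (see the next snippet).
    }

    private static string GetCategoryFromAzureOpenAI(string emailBody)
    {
        // Placeholders: copy the real endpoint and key from the "View code" dialog.
        const string endpoint = "https://<your-resource>.openai.azure.com/openai/deployments/<your-deployment>/chat/completions?api-version=2023-05-15";
        const string apiKey = "<your-key>";

        // System prompt abbreviated; the few-shot examples shown earlier would be
        // added here as additional user/assistant messages.
        var payload = new JObject
        {
            ["messages"] = new JArray
            {
                new JObject { ["role"] = "system", ["content"] = "You are an AI that categorizes cases as Question, Incident or Enhancement. Reply with the category only." },
                new JObject { ["role"] = "user", ["content"] = emailBody }
            },
            ["temperature"] = 0.2,
            ["max_tokens"] = 20
        };

        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Add("api-key", apiKey);
            var content = new StringContent(payload.ToString(), Encoding.UTF8, "application/json");
            var response = client.PostAsync(endpoint, content).GetAwaiter().GetResult();
            var json = JObject.Parse(response.Content.ReadAsStringAsync().GetAwaiter().GetResult());

            // The category comes back in choices[0].message.content.
            return (string)json["choices"][0]["message"]["content"];
        }
    }
}
```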


As mentioned before, the model returns the category of the email. Once the response is received, a record is created in the Incident table with the appropriate typology and associated with the received email.
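
A hedged sketch of that last step, assuming the standard incident entity, a simple mapping from the model’s answer to an option set (the column and option values are illustrative; adapt them to your customizations), and the Regarding field to associate the email with the case. Note that a real case also needs a customer, which is only hinted at here:

```csharp
using System;
using System.Collections.Generic;
using Microsoft.Xrm.Sdk;

// Hypothetical helper that could live inside the plugin above.
public static class CaseFactory
{
    public static void CreateCaseFromEmail(IOrganizationService service, Entity email, string category)
    {
        var incident = new Entity("incident");
        incident["title"] = email.GetAttributeValue<string>("subject");

        // Assumed mapping between the model's answer and the case typology
        // (for example a custom choice column or the standard "casetypecode").
        var categoryMap = new Dictionary<string, int>
        {
            { "Question", 1 },
            { "Incident", 2 },
            { "Enhancement", 3 }
        };
        if (categoryMap.TryGetValue((category ?? string.Empty).Trim(), out int optionValue))
            incident["casetypecode"] = new OptionSetValue(optionValue);

        // A case requires a customer; for example, resolve it from the email sender.
        // incident["customerid"] = new EntityReference("contact", senderContactId);

        Guid incidentId = service.Create(incident);

        // Associate the received email with the new case via the Regarding field.
        var emailUpdate = new Entity("email", email.Id);
        emailUpdate["regardingobjectid"] = new EntityReference("incident", incidentId);
        service.Update(emailUpdate);
    }
}
```

In this sketch the association is made through the email’s Regarding field; depending on the record-creation rules in your environment, the relationship may be established differently.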


Functionality evidence

Example 1

The customer’s email is created in Dataverse, and an incident associated with the email is automatically created with the correct typology.

Example 2

The same behavior is observed with a second email: an incident associated with it is automatically created with the correct typology.

I hope you have found this useful; the future awaits us. ;)

About the Author: Alberto Granjo

D365CE Technical Consultant
