OpenAI

Learn how to incorporate the artificial intelligence capabilities offered by OpenAI and the Microsoft Azure OpenAI Service into microservices built with the Devprime platform, using the extensibility provided by the Extensions Adapter.


Introduction

The Devprime platform accelerates software developer productivity by offering a complete software architecture design, components with intelligent behaviors, code accelerators, and continuous updates with new features.

In this article, we will look at the Extensions Adapter present in the Devprime architecture, which lets you add new behaviors through external NuGet components. This makes it possible to extend the functionality of the Devprime platform while adhering to software architecture standards, improving maintainability, reusability, and testability.

Throughout the article we will use the Azure.AI.OpenAI NuGet component, which provides access to the Microsoft Azure OpenAI Service and the OpenAI API.

Checklist and preparation of the initial environment:

Ready-made example with all the steps in the article

This article includes a complete project that demonstrates the feature discussed. You can choose to download it by following the steps below or simply proceed to the next items.

  1. Clone the repo
    git clone https://github.com/devprime/examples.git
  2. Enter the folder
    cd examples/extensions/openai/basic
  3. Upgrade your Devprime license
    dp stack
  4. Update MongoDB/RabbitMQ settings
  5. Update the Azure OpenAI/OpenAI credentials in appsettings.json
  • Locate the items in DevPrime_Custom:
    ai.url / ai.credential / ai.deploymentmodel
  6. Run the microservice

Creating a microservice to use in the example

The first step is to create a new microservice so that we can make customizations to the Extensions Adapter, adding the external NuGet component and preparing it to interact with the OpenAI API. The name of this microservice will be set to “ms-ai”, as demonstrated in the command below.
dp new ms-ai --state mongodb --stream rabbitmq

After creating the new microservice, enter the “ms-ai” project folder and open it in Visual Studio Code to view all the implementations, as demonstrated in the article on creating your first microservice.

Adding a Business Rule

The business rules in the Devprime platform architecture are based on Domain-Driven Design (DDD), and to move forward we need to add an Aggregate Root class inside the Domain project. To make this procedure easier, we will use the command below, available in the Devprime CLI.
dp add aggregate AI

Preview the new class in Visual Studio Code.
code src/Core/Domain/Aggregates/AI/AI.cs

Modify the AI class by adding a property called “Prompt” as shown in the example below.

namespace Domain.Aggregates.AI;
public class AI : AggRoot
{
    public string Prompt { get; private set; }
}

Using the Devprime Platform Code Accelerators

Now that you’ve implemented a business rule, let’s use the Devprime CLI accelerator to generate the necessary code and launch the microservice based on the initial business rule you provided.

Run the following command and type A to move forward with the implementations:

dp init

After completing this command, you will already have the basic implementations of your microservice, and we will use it as a reference for the next step: embedding the OpenAI component in the Extensions Adapter.

Adding the Intelligence extension

At this point, we will begin the procedures to enable a third-party extension provided as a NuGet component on the Devprime platform, following development patterns that ensure maintainability and decoupling.

To speed up this process, we’ll use the command below, which exposes the new extension to the application service through an interface and dependency injection, and generates the required initial implementation in the adapter.

Run the command below:
dp add extensions IntelligenceService

# created
/src/Core/Application/Interfaces/Adapters/Extensions/IIntelligenceService.cs
/src/Adapters/Extensions/IntelligenceService/IntelligenceService.cs
# modified
/src/Core/Application/Interfaces/Adapters/Extensions/IExtensions.cs 
/src/Adapters/Extensions/Extensions.cs 
/src/Adapters/Extensions/GlobalUsings.cs 
/src/App/App.cs
/src/App/GlobalUsings.cs

The IntelligenceService class and its interface IIntelligenceService implement the integrations with the OpenAI library. On the other hand, the Extensions class and its interface IExtensions act as proxies, allowing the Devprime Pipeline execution context to access all the extensions available in the application through dependency injection.
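To make this wiring concrete, the sketch below shows, conceptually, how the proxy exposes the extension. The actual files generated by the Devprime CLI may differ, so treat the names here only as an illustration of the pattern that enables calls such as Dp.Extensions.IntelligenceService.Conversation().

// Conceptual sketch only; the generated IExtensions/Extensions code may differ.
public interface IExtensions
{
    // Each extension is exposed as a property on the proxy,
    // making it reachable from the Dp.Pipeline execution context.
    IIntelligenceService IntelligenceService { get; }
}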

Adding the reference to the OpenAI component

Add the NuGet component reference Azure.AI.OpenAI to the Extensions Adapter project file. Be sure to check the NuGet portal for the most current version and adjust the version if necessary:

Run the command below on a single line:

dotnet add src/Adapters/Extensions/DevPrime.Extensions.csproj package Azure.AI.OpenAI --version 1.0.0-beta.12

Now that you’ve just added a new component, take the opportunity to build the microservice to make sure
that everything is working properly with this new dependency.

Run the command:

dotnet build

Implementing integration with the OpenAI component

At this point, we’re going to implement the integration with the OpenAI component inside the Extensions Adapter. The first step is to modify the interface to include a method called “Conversation” and then implement the code that makes the OpenAI API call and returns the result in a format that can be transported into the context of our application.

Open the IIntelligenceService interface in Visual Studio Code.
code src/Core/Application/Interfaces/Adapters/Extensions/IIntelligenceService.cs

Replace with the code below:

namespace Application.Interfaces.Adapters.Extensions;
public interface IIntelligenceService
{
     string Conversation(string prompt);
}

Open the IntelligenceService class in Visual Studio Code.
code src/Adapters/Extensions/IntelligenceService/IntelligenceService.cs

Replace with the code below:

using System.Collections.Generic;
using Azure;
using Azure.AI.OpenAI;

namespace DevPrime.Extensions.IntelligenceService;
public class IntelligenceService : DevPrimeExtensions, IIntelligenceService
{
    private string Credential { get; set; }
    private string URL { get; set; }
    private string DeploymentModel { get; set; }


    public IntelligenceService(IDpExtensions dp) : base(dp)
    {
        Credential = Dp.Settings.Default("ai.credential");
        URL = Dp.Settings.Default("ai.url");
        DeploymentModel = Dp.Settings.Default("ai.deploymentmodel");
    }

    public string Conversation(string prompt)
    {
        return Dp.Pipeline(ExecuteResult: () =>
        {
            Dp.Observability.Log("Starting OpenAI");

            // Prepare AI Configurations
            var chatCompletionsOptions = new ChatCompletionsOptions()
            {
                Messages = { },
                Temperature = (float)0.7,
                MaxTokens = 800,
                NucleusSamplingFactor = (float)0.95,
                FrequencyPenalty = 0,
                PresencePenalty = 0,
                DeploymentName = DeploymentModel
            };

            // Creating OpenAI Client
            var openAiClient = new OpenAIClient(new System.Uri(URL),
                new AzureKeyCredential(Credential));

            // Add prompt to conversation 
            //ChatRequestSystemMessage / ChatRequestUserMessage / ChatRequestAssistantMessage     
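            // Optionally, a system message could be added before the user prompt
            // to steer the assistant's behavior (illustrative text only):
            // chatCompletionsOptions.Messages.Add(
            //     new ChatRequestSystemMessage("You are a concise assistant."));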

            chatCompletionsOptions.Messages.Add(new ChatRequestUserMessage(prompt));

            // Prepare for interaction
            Response<ChatCompletions> response =
                openAiClient.GetChatCompletions(chatCompletionsOptions);

            //OpenAI Result
            var airesponse = response.GetRawResponse().Content.ToString();
            var root = System.Text.Json.JsonSerializer.Deserialize<Root>(airesponse);
            Message message = root.choices[0].message;
            var airesult = message.content;

            Dp.Observability.Log("Finalizing OpenAI");
            return airesult;
        });

    }

    public class ContentFilterResults
    {
        public bool filtered { get; set; }
        public string severity { get; set; }
    }

    public class PromptFilterResults
    {
        public int prompt_index { get; set; }
        public ContentFilterResults content_filter_results { get; set; }
    }

    public class Message
    {
        public string role { get; set; }
        public string content { get; set; }
        public ContentFilterResults content_filter_results { get; set; }
    }

    public class Choice
    {
        public int index { get; set; }
        public string finish_reason { get; set; }
        public Message message { get; set; }
        public ContentFilterResults content_filter_results { get; set; }
    }

    public class Usage
    {
        public int prompt_tokens { get; set; }
        public int completion_tokens { get; set; }
        public int total_tokens { get; set; }
    }

    public class Root
    {
        public string id { get; set; }
        public string _object { get; set; }
        public int created { get; set; }
        public string model { get; set; }
        public List<PromptFilterResults> prompt_filter_results { get; set; }
        public List<Choice> choices { get; set; }
        public Usage usage { get; set; }
    }
}
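As an optional simplification, and depending on the library version, the typed objects exposed by the Azure.AI.OpenAI client can usually be read directly instead of deserializing the raw JSON payload with the helper classes above; a minimal sketch for the single-choice case:

// Optional alternative inside Conversation(): read the typed response
// instead of parsing the raw JSON (assumes at least one choice is returned).
Response<ChatCompletions> response =
    openAiClient.GetChatCompletions(chatCompletionsOptions);
string airesult = response.Value.Choices[0].Message.Content;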

Adding environment variables for use in OpenAI

In the previous implementation, we used Dp.Settings.Default("ai.credential") to read a configuration parameter defined locally in the project’s “src/App/appsettings.json” file. Edit the “DevPrime_Custom” block, following the example below, to add the OpenAI service URL, credential, and deployment model.

Open and edit from Visual Studio Code
code src/App/appsettings.json

  "DevPrime_Custom": {
    "stream.processevents": "aievents",
    "ai.url": "Insert Azure OpenAI URL / OpenAI URL",
    "ai.credential": "Insert API KEY",
    "ai.deploymentmodel": "Insert deployment name"
  }

Implementing the Extensions Adapter call

The first point of contact with the functionality made available by the new OpenAI extension is the Handler. In this context we will use CreateAIEventHandler, whose role is to intercept a domain event and mediate the integration with the platform’s technological resources. To support Extensions, it inherits from the EventHandlerWithExtensions class and receives the IExtensions interface.

It is important to note that after the previous implementations we now have a new method Dp.Extensions.IntelligenceService.Conversation() that executes the external code embedded inside the Extensions Adapter.

This new implementation will execute the call to OpenAI and return the result. To move forward, open the file and replace all the code.

Open the Handler from Visual Studio Code.
code src/Core/Application/EventHandlers/AI/CreateAIEventHandler.cs

namespace Application.EventHandlers.AI;
public class CreateAIEventHandler : EventHandlerWithExtensions<CreateAI, IExtensions>
{
    public CreateAIEventHandler(IExtensions extensions, IDp dp) : base(extensions, dp)
    {
    }
    public override dynamic Handle(CreateAI createAI)
    {
        var aI = createAI.Get<Domain.Aggregates.AI.AI>();
        var result = Dp.Extensions.IntelligenceService.Conversation(aI.Prompt);
        return result;
    }
}

Because of this modification in the Handler, remove the test file that is not used in this example
using the command below:

Removing file
rm src/Tests/Core/Application/AI/CreateAIEventHandlerTest.cs

Modifying the Add Method in the Aggregate Root

Now we’ll return to the business rule in the Aggregate Root and customize the “Add” method, replacing it with the code below so that the “CreateAI” domain event is raised and its return value captured. This event will be processed by the “CreateAIEventHandler” handler, which is responsible for interacting with the Extensions Adapter.

Open the Aggregate Root class in Visual Studio Code
code src/Core/Domain/Aggregates/AI/AI.cs

public virtual string Add()
{
    var result = Dp.Pipeline(ExecuteResult: () =>
    {
        ValidFields();
        ID = Guid.NewGuid();
        IsNew = true;
        var processresult = Dp.ProcessEvent<string>(new CreateAI());
        return processresult;
    });
    return result;
}

Modifying the Interface and Application Services

Due to the implementation in the Extensions Adapter, it is necessary to return the Aggregate Root result from Application Services. The first step is to modify the IAIService interface as shown in the example below.

Open the interface in Visual Studio Code
code src/Core/Application/Interfaces/Services/IAIService.cs

namespace Application.Interfaces.Services;
public interface IAIService
{
    string Add(Application.Services.AI.Model.AI command);
    void Update(Application.Services.AI.Model.AI command);
    void Delete(Application.Services.AI.Model.AI command);
    Application.Services.AI.Model.AI Get(Application.Services.AI.Model.AI query);
    PagingResult<IList<Application.Services.AI.Model.AI>> GetAll(Application.Services.AI.Model.AI query);
}

After the interface modification in the previous step, it’s time to reflect that change in Application Services. Locate the Add method and replace it with the code below so that the application service returns the result from our Aggregate Root.

Open Application Services in Visual Studio Code.
code src/Core/Application/Services/AI/AIService.cs

    public string Add(Model.AI command)
    {
        return Dp.Pipeline(ExecuteResult: () =>
        {
            var process = command.ToDomain();
            Dp.Attach(process);
            var processresult = process.Add();
            return processresult;
        });
    }

Exploring the microservice integrated with OpenAI

With the implementation complete, we can now test the microservice customized with the OpenAI integration.

Start the microservice by running one of the scripts below:

.\run.ps1 (Windows) or ./run.sh (Linux, macOS)

Go to the microservice Swagger portal (https://localhost:5001/swagger).

Post to the API via Swagger by filling out the prompt

{
  "prompt": "Explain how the Devprime platform enhances developer productivity in one sentence?"
}

And check out the result in the JSON below:

{
  "response": "The Devprime platform streamlines development workflows and automates repetitive tasks, allowing developers to focus on writing high-quality code and delivering projects faster.",
  "success": true
}
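
If you prefer the command line to Swagger, the same request can be sent directly to the endpoint shown in the log below. This assumes the default local HTTPS port; the -k flag skips validation of the development certificate.

curl -k -X POST https://localhost:5001/v1/ai -H "Content-Type: application/json" -d '{"prompt": "Explain how the Devprime platform enhances developer productivity in one sentence?"}'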

Follow along with the example log record automatically generated by the Observability Adapter, which details the entire internal flow, from receiving the POST to integrating with OpenAI in the Extensions Adapter.

[App]["Powered by DevPrime"][Version]["7.0.59"][License]["Developer"]
[App]["Start"]["ms-ai"][Configuration]["Appsettings"][AppVersion]["1.0.0.0"]
[App][Idempotency]["Disable"][RID 3b8cdd1a-48cc-44ea-9d4d-af36db35e4e4]
[State][Type "MongoDB"][Alias "State1"]["Database"]["Enable"]
[App][Tenancy]["Disable"]
[App][Observability]["Enable"][Log "Enable"][Export "Enable"][Type "seq"]
[Trace "Disable"][Metrics "Disable"]
[Security]["Disable"]
[Web]["https://localhost:5001"]["http://localhost:5000"][Host]["Production"]
[Parameters]["Appsettings"]
[Stream][Type "RabbitMQ"][Alias "Stream1"]["Enable"]
[Web]["HTTP"][AI][POST /v1/ai]
[Origin "https://localhost:5001/swagger/index.html"]
[Application][AIService][Add][RID aa13bca3-76f3-4b51-b35e-c05e24e0cad3]
[Domain][AI][Add]
[Domain][AI][ProcessEvent]["CreateAI"]
[Application][EventHandler]["CreateAIEventHandler"][Event]["CreateAI"]
[Extensions][IntelligenceService][Conversation]
[Custom][Starting OpenAI]
[Custom][Finalizing OpenAI]

