OpenAI

Learn how to incorporate the artificial intelligence capabilities offered by OpenAI and Microsoft Azure OpenAI Service into microservices using the Devprime platform through the extensibility feature in the Extensions Adapter.


Introduction

The Devprime Platform (https://devprime.io) accelerates software developer productivity by offering a complete software architecture design, components with intelligent behaviors, accelerators for code deployment, and updates with new features.

In this article, we will discuss the use of the Extensions Adapter present in the Devprime architecture, which allows you to add additional behaviors through external components based on the NuGet technology. This provides the possibility to expand the functionalities of the Devprime platform and adhere to software architecture standards, improving maintainability, reusability, and testability.

Throughout this article we will use the Azure.AI.OpenAI NuGet component, which provides direct access to both the Microsoft Azure OpenAI Service and OpenAI APIs, so you can test with the credentials of either service.

Checklist and preparation of the initial environment:

Example ready with all the steps of the article

This article includes a complete project that demonstrates the feature discussed. You can choose to download it by following the steps below or simply proceed to the next items.

  1. Clone the repository
    git clone https://github.com/devprime/examples.git
  2. Enter the folder
    cd examples/extensions/openai/basic
  3. Upgrade your Devprime license
    dp stack
  4. Update the MongoDB/RabbitMQ settings in the src/App/appsettings.json file
  5. Update the Azure OpenAI/OpenAI credentials in the src/App/appsettings.json file
  • Locate the following items in DevPrime_Custom:
    ai.url / ai.credential / ai.deploymentmodel
  6. Run the microservice

Creating a microservice to use in the example

The first step is to create a new microservice so that we can perform customizations in the Extensions Adapter, adding the external NuGet component and preparing it to interact with the OpenAI API. The name of this microservice will be set to “ms-ai” as demonstrated in the command below.
dp new ms-ai --state mongodb --stream rabbitmq

After creating the new microservice, enter the “ms-ai” project folder, where you can view all implementations in Visual Studio Code, as demonstrated in the article on creating your first microservice.

Adding a Business Rule

Business rules in the Devprime platform architecture are based on Domain-Driven Design (DDD). To move forward, it is necessary to add an Aggregate Root class within the Domain project. To facilitate this procedure, we will use the command below, available in the Devprime CLI.
dp add aggregate AI

Open the newly created class in Visual Studio Code.
code src/Core/Domain/Aggregates/AI/AI.cs

Modify the AI class by adding a property called “Prompt”, as shown in the example below.

namespace Domain.Aggregates.AI;
public class AI : AggRoot
{
    public string Prompt { get; private set; }
}

Using Devprime Platform Code Accelerators

Now that you’ve implemented a business rule, let’s use the Devprime CLI accelerator to generate the necessary code and start the microservice based on the initial business rule you provided.

Run the following command and type A to proceed with the implementations:

dp init

After completing this command, you will have the basic implementations of your microservice, which we will use as a reference for the next step: embedding the OpenAI component in the Extensions Adapter.

Adding Extension Intelligence

At this point, we will begin the procedures to enable a third-party extension provided as a NuGet component on the Devprime platform, following development standards that ensure maintainability and decoupling.

To speed up this process, we’ll use the command below, which implements the new extension in the App Service via an interface and dependency injection, and will build the required initial implementation into the adapter.

Run the command below:
dp add extensions IntelligenceService

# created
/src/Core/Application/Interfaces/Adapters/Extensions/IIntelligenceService.cs
/src/Adapters/Extensions/IntelligenceService/IntelligenceService.cs
# modified
/src/Core/Application/Interfaces/Adapters/Extensions/IExtensions.cs 
/src/Adapters/Extensions/Extensions.cs 
/src/Adapters/Extensions/GlobalUsings.cs 
/src/App/App.cs
/src/App/GlobalUsings.cs

The IntelligenceService class and its interface IIntelligenceService implement the integrations with the OpenAI library. On the other hand, the Extensions class and its IExtensions interface act as proxies, allowing the Devprime Pipeline execution context to access all the extensions available in the application through dependency injection.
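
To make the proxy wiring concrete, here is a minimal sketch of what the generated IExtensions interface typically looks like; the exact code produced by the CLI may differ, so treat it as an illustration rather than the generated file itself:

namespace Application.Interfaces.Adapters.Extensions;
public interface IExtensions
{
    // Each extension is exposed as a property so that handlers can reach it
    // through Dp.Extensions without referencing the NuGet package directly.
    IIntelligenceService IntelligenceService { get; }
}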

Adding the OpenAI Component Reference

Add the NuGet component reference Azure.AI.OpenAI to the Extensions Adapter project file. Be sure to check the NuGet portal for the most current version and adjust the version if necessary:

Run the command below on a single line:

dotnet add src/Adapters/Extensions/DevPrime.Extensions.csproj package Azure.AI.OpenAI --version 1.0.0-beta.17

Now that you’ve added a new component, build the microservice to make sure everything is working properly with this new dependency.

Run the command:

dotnet build

Implementing the OpenAI Component Integration

At this point, we will implement the integration with the OpenAI component within the Extensions Adapter. The first step is to modify the interface to include a method called “Conversation” and then implement the code that makes the OpenAI API call and returns the result in a format that can be transported to the context of our application.

Open the IIntelligenceService interface in Visual Studio Code.
code src/Core/Application/Interfaces/Adapters/Extensions/IIntelligenceService.cs

Replace with the code below:

namespace Application.Interfaces.Adapters.Extensions;
public interface IIntelligenceService
{
     string Conversation(string prompt);
}

Open the class IntelligenceService in Visual Studio Code.
code src/Adapters/Extensions/IntelligenceService/IntelligenceService.cs

Replace with the code below:

using System.Collections.Generic;
using Azure;
using Azure.AI.OpenAI;

namespace DevPrime.Extensions.IntelligenceService
{
    // Main class implementing the IIntelligenceService interface
    public class IntelligenceService : DevPrimeExtensions, IIntelligenceService
    {
        // Private properties to store credentials and intelligence service configurations
        private string Credential { get; set; }
        private string Url { get; set; }
        private string DeploymentModel { get; set; }
        private string Platform { get; set; }


        // Constructor initializing properties based on DpExtensions settings
        public IntelligenceService(IDpExtensions dp) : base(dp)
        {
            Credential = Dp.Settings.Default("ai.credential");
            Url = Dp.Settings.Default("ai.url");
            DeploymentModel = Dp.Settings.Default("ai.deploymentmodel");
            Platform = ValidatePlatformSetting(Dp.Settings.Default("ai.platform") ?? "azure");
        }

        // Method to start a conversation with the AI service
        public string Conversation(string prompt)
        {
            // Using DpExtensions pipeline to execute conversation logic
            return Dp.Pipeline(ExecuteResult: () =>
            {
                // Log to indicate the start of interaction with OpenAI service
                Dp.Observability.Log("Starting OpenAI");

                // Settings for interacting with OpenAI service
                var chatCompletionsOptions = new ChatCompletionsOptions()
                {

                    Temperature = (float)0.7,
                    MaxTokens = 800,
                    NucleusSamplingFactor = (float)0.95,
                    FrequencyPenalty = 0,
                    PresencePenalty = 0,
                    DeploymentName = DeploymentModel
                };

                // Creating OpenAI client with provided credentials
                
                var openAiClient = Platform == "azure" ?
                new OpenAIClient(new System.Uri(Url), new AzureKeyCredential(Credential)) :
                new OpenAIClient(Credential);


                Dp.Observability.Log($"Starting OpenAI using platform {Platform}");

                // Adding user's message to the conversation
                // ChatRequestSystemMessage / ChatRequestUserMessage / ChatRequestAssistantMessage  
                chatCompletionsOptions.Messages.Add(new ChatRequestUserMessage(prompt));

                // Preparation for interaction with OpenAI service
                Response<ChatCompletions> response = openAiClient.GetChatCompletions(chatCompletionsOptions);

                // Processing OpenAI response
                var airesponse = response.GetRawResponse().Content.ToString();
                var root = System.Text.Json.JsonSerializer.Deserialize<Root>(airesponse);
                Message message = root.choices[0].message;
                var airesult = message.content;

                // Log to indicate the end of interaction with OpenAI service
                Dp.Observability.Log("Finalizing OpenAI");

                // Returning OpenAI response
                return airesult;
            });
        }

        // Definition of inner classes to structure OpenAI response
        public class ContentFilterResults
        {
            public bool filtered { get; set; }
            public string severity { get; set; }
        }

        public class PromptFilterResults
        {
            public int prompt_index { get; set; }
            public ContentFilterResults content_filter_results { get; set; }
        }

        public class Message
        {
            public string role { get; set; }
            public string content { get; set; }
            public ContentFilterResults content_filter_results { get; set; }
        }

        public class Choice
        {
            public int index { get; set; }
            public string finish_reason { get; set; }
            public Message message { get; set; }
            public ContentFilterResults content_filter_results { get; set; }
        }

        public class Usage
        {
            public int prompt_tokens { get; set; }
            public int completion_tokens { get; set; }
            public int total_tokens { get; set; }
        }

        public class Root
        {
            public string id { get; set; }
            public string _object { get; set; }
            public int created { get; set; }
            public string model { get; set; }
            public List<PromptFilterResults> prompt_filter_results { get; set; }
            public List<Choice> choices { get; set; }
            public Usage usage { get; set; }
        }

        private string ValidatePlatformSetting(string setting)
        {
            Dp.Observability.Log($"[Extensions]ValidatePlatformSetting:{setting}");
            var platform = setting.ToLower();
            if (platform != "azure" && platform != "openai")
            {
                Dp.Observability.Log($"[Extensions]ERROR: Platform must be either 'azure' or 'openai'.");
                throw new System.Exception("Platform must be either 'azure' or 'openai'.");
            }
            return platform;
        }
    }
}

Adding Environment Variables for Use in OpenAI

In the previous implementation we used Dp.Settings.Default("ai.credential") to read a parameter defined locally in the “src/App/appsettings.json” file. Open the “DevPrime_Custom” block and, following the example below, add the OpenAI service URL, the credential, and the deployment model; finally, set the “ai.platform” item to openai or azure.

Open and edit through Visual Studio Code
code src/App/appsettings.json

  "DevPrime_Custom": {
    "stream.processevents": "aievents",
    "ai.url": "put azure url or openai",
    "ai.credential": "put azure key or openai",
    "ai.deploymentmodel": "put deployment name",
    "ai.platform": "change to openai or azure"
  }
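
For reference, a hypothetical Azure OpenAI configuration might look like the example below; the endpoint, key, and deployment name are placeholders that must be replaced with your own values (for the OpenAI platform, set "ai.platform" to openai and use your OpenAI API key as the credential):

  "DevPrime_Custom": {
    "stream.processevents": "aievents",
    "ai.url": "https://<your-resource>.openai.azure.com/",
    "ai.credential": "<your-azure-openai-key>",
    "ai.deploymentmodel": "<your-deployment-name>",
    "ai.platform": "azure"
  }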

Implementing the Extensions Adapter call

The first point of contact with the functionality exposed by the new OpenAI extension is the handler; in this context we will use CreateAIEventHandler, which intercepts a domain event and mediates the integration with the platform’s technological resources. To support Extensions, the handler inherits from the EventHandlerWithExtensions class and receives the IExtensions interface.

Note that, after the previous implementations, the Devprime architecture now exposes a new method, Dp.Extensions.IntelligenceService.Conversation(), which executes the external code embedded within the Extensions Adapter.

This new implementation will execute the call to OpenAI and return the result. To move forward, open the file and replace all the code.

Open the Handler from Visual Studio Code.
code src/Core/Application/EventHandlers/AI/CreateAIEventHandler.cs

namespace Application.EventHandlers.AI;
public class CreateAIEventHandler : 
EventHandlerWithExtensions<CreateAI, IExtensions>
{
    public CreateAIEventHandler(IExtensions extensions, IDp dp) : 
    base(extensions, dp)
    {
    }
    public override dynamic Handle(CreateAI createAI)
    {
        var ai = createAI.Get<Domain.Aggregates.AI.AI>();
        var result = Dp.Extensions.IntelligenceService.Conversation(ai.Prompt);
        return result;
    }
}

Because of this modification to the handler, remove the test file that is not used in this example by running the command below:

Removing file
rm src/Tests/Core/Application/AI/CreateAIEventHandlerTest.cs

Modifying the Add method in the Aggregate Root

Now we’ll return to the business rule in the Aggregate Root and customize the “Add” method, replacing it with the code below so that it triggers the “CreateAI” domain event and returns the result. This event will be processed by the “CreateAIEventHandler” handler, which is responsible for interacting with the Extensions Adapter.

Open the Aggregate Root class in Visual Studio Code
code src/Core/Domain/Aggregates/AI/AI.cs

public virtual string Add()
{
    var result = Dp.Pipeline(ExecuteResult: () =>
    {
        ValidFields();
        ID = Guid.NewGuid();
        IsNew = true;
        var processresult = Dp.ProcessEvent<string>(new CreateAI());
        return processresult;
    });
    return result;
}

Modifying the Interface and Application Services

Due to the implementation in the Extensions Adapter, it is necessary to return the Aggregate Root result from Application Services. The first step is to modify the IAIService interface as shown in the example below.

Open the interface in Visual Studio Code
code src/Core/Application/Interfaces/Services/IAIService.cs

namespace Application.Interfaces.Services;
public interface IAIService
{
    string Add(Application.Services.AI.Model.AI command);
    void Update(Application.Services.AI.Model.AI command);
    void Delete(Application.Services.AI.Model.AI command);
    Application.Services.AI.Model.AI 
    Get(Application.Services.AI.Model.AI query);
    PagingResult<IList<Application.Services.AI.Model.AI>>
    GetAll(Application.Services.AI.Model.AI query);
}

After modifying the interface in the previous step, it’s time to reflect that change in Application Services. Locate the Add method below and make the replacement so that the app service returns the result of our Aggregate Root.

Open Application Services in Visual Studio Code.
code src/Core/Application/Services/AI/AIService.cs

    public string Add(Model.AI command)
    {
        return Dp.Pipeline(ExecuteResult: () =>
        {
            var process = command.ToDomain();
            Dp.Attach(process);
            var processresult = process.Add();
            return processresult;
        });
    }

Exploring the microservice integrated with OpenAI

With this implementation complete, we can test the microservice customized with the OpenAI integration by running it.

Start the microservice by running one of the scripts below:

.\run.ps1 (Windows) or ./run.sh (Linux, macOS)

Go to the microservice Swagger portal

Post to the API from Swagger by filling out the prompt

{
  "prompt": "Explain how the Devprime platform enhances developer productivity
   in one sentence?"
}
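
If you prefer the command line to Swagger, a roughly equivalent request is sketched below; it assumes the default local endpoint and the /v1/ai route shown in the log further down (the -k flag skips validation of the local development certificate):

curl -k -X POST https://localhost:5001/v1/ai \
  -H "Content-Type: application/json" \
  -d '{ "prompt": "Explain how the Devprime platform enhances developer productivity in one sentence?" }'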

And check out the result in the JSON below:

{
  "response": "The Devprime platform streamlines development workflows
   and automates repetitive tasks,   allowing developers to focus on
   writing high-quality code and delivering projects faster.",
  "success": true
}

Follow along with the example log record automatically generated by the Observability Adapter, which details the entire internal flow, from receiving the POST to integrating with OpenAI in the Extensions Adapter.

[App]["Powered by DevPrime"][Version]["7.0.59"][License]["Developer"]
[App]["Start"]["ms-ai"][Configuration]["Appsettings"][AppVersion]["1.0.0.0"]
[App][Idempotency]["Disable"][RID 3b8cdd1a-48cc-44ea-9d4d-af36db35e4e4]
[State][Type "MongoDB"][Alias "State1"]["Database"]["Enable"]
[App][Tenancy]["Disable"]
[App][Observability]["Enable"][Log "Enable"][Export "Enable"][Type "seq"]
[Trace "Disable"][Metrics "Disable"]
[Security]["Disable"]
[Web]["https://localhost:5001"]["http://localhost:5000"][Host]["Production"]
[Parameters]["Appsettings"]
[Stream][Type "RabbitMQ"][Alias "Stream1"]["Enable"]
[Web]["HTTP"][AI][POST /v1/ai]
[Origin "https://localhost:5001/swagger/index.html"]
[Application][AIService][Add][RID aa13bca3-76f3-4b51-b35e-c05e24e0cad3]
[Domain][AI][Add]
[Domain][AI][ProcessEvent]["CreateAI"]
[Application][EventHandler]["CreateAIEventHandler"][Event]["CreateAI"]
[Extensions][IntelligenceService][Conversation]
[Custom][Starting OpenAI]
[Custom][Finalizing OpenAI]

