Dacris Software

Beyond Ordinary Thinking

Building a Self-Writing App with Maestro

Let's build a self-writing .NET app using the Maestro Framework and Ollama integration. This tutorial assumes you have already downloaded and installed the Maestro Framework from Dacris Software. You will also need Ollama installed locally, with Phi-3 Mini as the model it serves. Note: to run an LLM locally at a usable speed, you need a sufficiently powerful GPU or an NPU.
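
If you haven't pulled the model yet, the standard Ollama commands below download Phi-3 Mini (published under the model tag phi3) and let you confirm that it responds before you wire it into Maestro:

ollama pull phi3
ollama run phi3 "Reply with a short greeting."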

Maestro allows anyone to build an app using minimal code and maximal configuration. It is built around "interactions": small building blocks that each perform a single task.

Our app will be a console app consisting of a series of SQL queries applied to an in-memory data table. The SQL queries will be auto-generated from Ollama prompts; that is the self-writing part of the app.
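
To make the self-writing idea concrete, here is a rough sketch in plain C# of what a chatbot step boils down to: post a prompt to the local Ollama server (it listens on port 11434 by default) and save the model's reply to a file. This is not Maestro code and not how the framework is implemented internally; it only illustrates the Ollama call that the chatbot interaction wraps for you, using the same prompt we will configure in Step 1.

// Sketch only: a direct call to the local Ollama REST API, roughly what a
// "chatbot" interaction does on our behalf. Not Maestro code.
using System;
using System.IO;
using System.Net.Http;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;

class OllamaSketch
{
    static async Task Main()
    {
        using var http = new HttpClient { BaseAddress = new Uri("http://localhost:11434") };

        var payload = JsonSerializer.Serialize(new
        {
            model = "phi3",
            prompt = "Write a single SQL query that inserts a row into a table " +
                     "called Data with columns ID (int) and Message (varchar).",
            stream = false
        });

        var response = await http.PostAsync("/api/generate",
            new StringContent(payload, Encoding.UTF8, "application/json"));
        response.EnsureSuccessStatusCode();

        // Ollama returns a JSON object whose "response" field holds the model's text.
        using var doc = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
        var sql = doc.RootElement.GetProperty("response").GetString();

        // The Maestro chatbot interaction saves its reply to ChatResponse.txt
        // (see the file1/file2 settings in Step 1); we mimic that here.
        await File.WriteAllTextAsync("ChatResponse.txt", sql);
        Console.WriteLine(sql);
    }
}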

Step 1 - Create the Generator App

1. Create a folder called "Generator".
2. In it, create a text file called Generator.txt.
3. Copy and paste the following code into the text file (this is the logic of the app):

alias Dacris.Maestro as the
read state from file using the Core
chatbot using the AI with chat1
move file using the Storage with file1
chatbot using the AI with chat2
move file using the Storage with file2

4. Create a file called State.json. In it, add the following JSON (this is the configuration; the names chat1, chat2, file1, and file2 used in Generator.txt point to the blocks defined here):

{
  "chat1": {
    "chatSettingsPath": "myChatConfig",
    "prompt": "Write a single SQL query that inserts a row into a table called Data with columns ID (int) and Message (varchar).",
    "isCode": "True"
  },
  "chat2": {
    "chatSettingsPath": "myChatConfig",
    "prompt": "Write a single SQL query that selects all data from a table called Data.",
    "isCode": "True"
  },
  "chatSettings": {
    "myChatConfig": {
      "systemType": "Ollama",
      "model": "phi3"
    }
  },
  "file1": {
    "inputFile": "ChatResponse.txt",
    "outputFile": "Query1.txt"
  },
  "file2": {
    "inputFile": "ChatResponse.txt",
    "outputFile": "Query2.txt"
  }
}
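
The model's wording will vary from run to run, but given these two prompts, the queries it writes back (saved as Query1.txt and Query2.txt once the Generator runs in Step 2) should look roughly like this:

INSERT INTO Data (ID, Message) VALUES (1, 'Hello world');
SELECT * FROM Data;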

Step 2 - Create the Runner App

1. Create a folder called "Runner".
2. In it, create a text file called Runner.txt.
3. Copy and paste the following code into the text file:

alias Dacris.Maestro as the
read state from file using the Core
read csv file using the Data with csvInput
modify using the Data with query1
select using the Data with query2

4. Create a text file called MyData.csv and add the following line:

ID,Message

5. Create a file called State.json. In it, add the following JSON:

{
  "csvInput": {
    "connPath": "mem",
    "inputFile": "MyData.csv",
    "schemaPath": "$.csvSchema",
    "tableName": "Data",
    "createTable": "true"
  },
  "csvSchema": {
    "ID": "integer",
    "Message": "string"
  },
  "dataConnections": {
    "mem": { "systemType": "MemorySql" }
  },
  "query1": {
    "connPath": "mem",
    "query": "Query1"
  },
  "query2": {
    "connPath": "mem",
    "outputFile": "Output.csv",
    "query": "Query2",
    "schemaPath": "$.csvSchema"
  }
}
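
Maestro's MemorySql connection gives you an in-memory SQL engine; the implementation details belong to the framework, but the configuration above describes a simple data flow, sketched below in plain C# with an in-memory SQLite database standing in for MemorySql. This is only an illustration of what csvInput, query1, and query2 amount to, not how the Runner actually works internally.

// Illustration only: the Runner's data flow re-created with in-memory SQLite
// as a stand-in for Maestro's MemorySql connection.
using System.IO;
using Microsoft.Data.Sqlite;

class RunnerSketch
{
    static void Main()
    {
        using var conn = new SqliteConnection("Data Source=:memory:");
        conn.Open();

        // csvInput: create the Data table described by csvSchema.
        Exec(conn, "CREATE TABLE Data (ID INTEGER, Message TEXT)");
        // (Loading rows from MyData.csv is skipped here: the file only has a header.)

        // query1: run the generated INSERT from Constants/Query1.txt.
        Exec(conn, File.ReadAllText(Path.Combine("Constants", "Query1.txt")));

        // query2: run the generated SELECT and write the result to Output.csv.
        using var cmd = conn.CreateCommand();
        cmd.CommandText = File.ReadAllText(Path.Combine("Constants", "Query2.txt"));
        using var reader = cmd.ExecuteReader();
        using var output = new StreamWriter("Output.csv");
        output.WriteLine("ID,Message");
        while (reader.Read())
            output.WriteLine($"{reader["ID"]},{reader["Message"]}");
    }

    static void Exec(SqliteConnection conn, string sql)
    {
        using var cmd = conn.CreateCommand();
        cmd.CommandText = sql;
        cmd.ExecuteNonQuery();
    }
}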

6. Run the Generator app. Command-line: dotnet <path-to-LogicAppRunner.dll> Generator

7. Create a subfolder in the Runner folder called Constants, then copy the two generated files, Query1.txt and Query2.txt, from the Generator folder into the Constants folder.

8. Run the Runner app. Command-line: dotnet <path-to-LogicAppRunner.dll> Runner

9. You should see a file in your Runner folder called Output.csv. It should contain one data row, generated by the Ollama chatbot.
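
The exact values depend on what the model put into its INSERT statement, but Output.csv should contain something along these lines:

ID,Message
1,Hello world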

This how-to tutorial showed you how to use Ollama to generate the SQL queries for a custom app built with the Maestro Framework.

You don't have to write code to create Maestro apps: if you prefer, you can build them visually (point and click) with the Maestro Assembler IDE.