
This tutorial shows how to set up an NLP-to-SQL chatbot in Dify on the Shakudo platform.
Users need access to the Dify tool API codebase here: https://github.com/devsentient/shakudo-examples/tree/main/backend-dify-opensource
Stack components required/deployed: Dify, Ollama.
Configuration step (only needed if you want to use an open-source Ollama model):
First pull the model in Ollama, then register it with Dify.
To pull a model in Ollama, run the following command from a Session:
curl http://ollama.hyperplane-ollama.svc.cluster.local:11434/api/pull -d '{
  "name": "qwen2.5:14b-instruct-q4_K_S"
}'
Then register the model in Dify's model provider settings.

Commit the code mentioned in the prerequisites to a repository accessible from the Shakudo Git Server.
In the Microservices panel, create a new microservice.
General tab: name your microservice, choose the endpoint name, and choose Basic as the Environment Config. In the Pipeline section, choose Shell and type backend-dify-opensource/run.sh (this may differ depending on your GitHub path setup), which points to the deployment script. Set the port to 8000.
Code: https://github.com/devsentient/shakudo-examples/tree/main/backend-dify-opensource
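The deployment script referenced above starts the tool API inside the microservice. The actual run.sh lives in the repository and may differ; as a hedged sketch only, a script of this kind typically installs dependencies and binds a Python web app to the configured port (the app module name `app:app` here is hypothetical):

```shell
#!/bin/bash
# Hypothetical sketch of a deployment script like run.sh; consult the
# repository for the real contents. Assumes a Python API app exposed
# as `app` in app.py, served on the port configured in the
# microservice (8000).
cd "$(dirname "$0")"
pip install -r requirements.txt
uvicorn app:app --host 0.0.0.0 --port 8000
```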
Note the in-cluster endpoint for the service. It will look similar to http://hyperplane-service-016559.hyperplane-pipelines.svc.cluster.local:8787
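Once the microservice is running, you can sanity-check reachability from any Session by requesting the in-cluster endpoint. The URL below mirrors the example endpoint above; substitute the one shown for your own service:

```shell
# Replace with the in-cluster endpoint reported for your microservice.
SERVICE_URL="http://hyperplane-service-016559.hyperplane-pipelines.svc.cluster.local:8787"

# Print only the HTTP status code; any response (even a 404 with a
# body) confirms the service is up and reachable inside the cluster.
curl -s -o /dev/null -w "%{http_code}\n" "$SERVICE_URL"
```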
Go to Dify → Studio and create a chatbot by importing a DSL file.
The DSL file, named helloSQL.yml, can be found in the codebase linked above.

Open the helloSQL chatbot.
You can set the dify_tool_api endpoint in the return of the Code node as the default value, or pass it as a parameter each time you chat with the chatbot.
The dify_tool_api endpoint is the in-cluster URL of the microservice created earlier.
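If you pass the endpoint per conversation rather than hard-coding it, it goes in as a chat input. The sketch below uses Dify's service API (POST /v1/chat-messages); the Dify host, API key, and the input variable name `dify_tool_api` are assumptions here, and the variable must match the one defined in the helloSQL DSL:

```shell
# Hypothetical values: replace with your Dify app's API key, your
# Dify API base URL, and your microservice's in-cluster endpoint.
DIFY_API_KEY="app-xxxxxxxx"
DIFY_BASE_URL="http://your-dify-host/v1"
DIFY_TOOL_API="http://hyperplane-service-016559.hyperplane-pipelines.svc.cluster.local:8787"

# Send a chat message, supplying dify_tool_api as a conversation input.
curl -X POST "$DIFY_BASE_URL/chat-messages" \
  -H "Authorization: Bearer $DIFY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "inputs": {"dify_tool_api": "'"$DIFY_TOOL_API"'"},
    "query": "How many customers placed orders last month?",
    "response_mode": "blocking",
    "user": "demo-user"
  }'
```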