Merge pull request #38 from docker/cm/da-big-1
Start refactor to use new prompts system
Showing 20 changed files with 5,258 additions and 635 deletions.
New workflow "Build extension on release" (@@ -0,0 +1,32 @@):

```yaml
name: Build extension on release

on:
  release:
    types: [published]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: "Checkout code"
        uses: actions/checkout@v4

      - name: "Set up Node.js"
        uses: actions/setup-node@v2
        with:
          node-version: "20"

      - name: "Install dependencies"
        run: npm install
      - name: "Build extension"
        run: npm run package

      - name: upload vsix
        uses: actions/upload-release-asset@v1
        env:
          GITHUB_TOKEN: ${{ github.token }}
        with:
          upload_url: ${{ github.event.release.upload_url }}
          asset_path: *.vsix
          asset_name: ${{ github.event.release.tag_name }}.vsix
          asset_content_type: application/vsix
```
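One caveat in the upload step as committed: `actions/upload-release-asset@v1` expects a concrete file path and does not expand globs, and an unquoted `*.vsix` is parsed by YAML as an alias rather than a literal string. A minimal sketch of one workaround, assuming the build leaves a single `.vsix` in the workspace root; the "Locate vsix" step name and the `VSIX_PATH` variable are illustrative, not part of the commit:

```yaml
      # Resolve the generated .vsix filename into an environment variable,
      # since upload-release-asset needs a concrete path rather than a glob.
      - name: "Locate vsix"
        run: echo "VSIX_PATH=$(ls *.vsix)" >> "$GITHUB_ENV"

      - name: upload vsix
        uses: actions/upload-release-asset@v1
        env:
          GITHUB_TOKEN: ${{ github.token }}
        with:
          upload_url: ${{ github.event.release.upload_url }}
          asset_path: ${{ env.VSIX_PATH }}
          asset_name: ${{ github.event.release.tag_name }}.vsix
          asset_content_type: application/vsix
```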
README (@@ -1,45 +1,30 @@):

````diff
-# Docker Runbook Generator
+# Docker AI Prompts
 
-The Docker Runbook Generator is a standalone VSCode extension to add additional runbook features on top of the experimental [Docker-VScode](https://github.com/docker/docker-vscode/) extension.
-
-## What is this project?
-
-"Make Runbook" uses generative AI and project analysis to generate a Docker specific runbook-style `README.md` to your project.
-
-See the following for an example:
+## What is this project?
 
-![runbook demo video](./screenshots/demo.gif)
-
 ## Getting started
 Docker Desktop must be installed, and running with an active Docker Hub account.
 Docker internal users: You must be opted-out of mandatory sign-in.
 
-1. Install latest VSIX file https://github.com/docker/labs-make-runbook/releases
+1. Install latest VSIX file https://github.com/docker/labs-ai-tools-vscode/releases
 2. Open your workspace
-3. Execute command `>Set OpenAI API key...` and enter your OpenAI secret key. Alternatively, configure Ollama in settings `docker.make-runbook`
-4. Execute command `>Generate a runbook for this project`
+3. Execute command `>Set OpenAI API key...` and enter your OpenAI secret key.
+4. Execute command `>save-prompt` and enter your prompt URL/ref.
+Example prompt: `https://github.com/docker/labs-ai-tools-for-devs/tree/main/prompts/poem`
+5. Execute command `>run-prompt`
 
 This project is a research prototype. It is ready to try and will give results for any project you try it on.
 
 We are still actively working on the prompt engineering.
 
 ## Development
 
 ### Ollama support
 We use the OpenAI Typescript client, meaning all OpenAI compatible models can be used.
 
 Configure the model and endpoint using settings
 `docker.make-runbook.openai-base` and `docker.make-runbook.openai-model`
 
 ![ollama config screenshot](./screenshots/ollama.png)
 
 ### Changing prompts
 See [prompts README](./prompts/README.md).
 
 ### Local developement
 
 ```sh
 # docker:command=build-and-install
 yarn run compile
 yarn run package
-code --install-extension make-runbook-0.0.9.vsix
+# Outputs vsix file
+code --install-extension your-file.vsix
 ```
````