
Graph store with Anthropic LLM and Ollama embedder not working #2115

Open
acgonzales opened this issue Dec 28, 2024 · 0 comments

🐛 Describe the bug

I have this config and basic code to add a memory:

from mem0 import Memory

config = {
    "vector_store": {
        "provider": "qdrant",
        "config": {
            "collection_name": "test",
            "host": "localhost",
            "port": 6333,
            "embedding_model_dims": 768,
        },
    },
    "llm": {
        "provider": "anthropic",
        "config": {
            "model": "claude-3-5-sonnet-20241022",
            "temperature": 0.1,
            "max_tokens": 8192,
        },
    },
    "graph_store": {
        "provider": "neo4j",
        "config": {
            "url": "neo4j://localhost:7687",
            "username": "neo4j",
            "password": "password",
            "embedding_model_dims": 768,
        },
    },
    "embedder": {
        "provider": "ollama",
        "config": {
            "model": "nomic-embed-text:latest",
            "ollama_base_url": "http://localhost:11434",
        },
    },
    "version": "v1.1"
}

m = Memory.from_config(config)

res = m.add("I am working on improving my photography skills. Suggest some online courses.", user_id="john")
print(res)

The code above throws the following error on the m.add call:

anthropic.BadRequestError: Error code: 400 - {'type': 'error', 'error': {'type': 'invalid_request_error', 'message': 'tool_choice: Input should be a valid dictionary or object to extract fields from'}}

It works well without the graph_store key in config.
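
For reference, the Anthropic Messages API expects tool_choice to be an object such as {"type": "auto"}, so the 400 above looks like it is being passed in some other shape when the graph store is enabled. A minimal standalone sketch of the call shape the API accepts (the tool schema below is just an illustrative placeholder, not mem0's actual tool definition):

import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

# tool_choice must be an object like {"type": "auto"}, not a bare string
response = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    tools=[
        {
            "name": "add_memory",  # placeholder tool for illustration only
            "description": "Store a memory string.",
            "input_schema": {
                "type": "object",
                "properties": {"data": {"type": "string"}},
                "required": ["data"],
            },
        }
    ],
    tool_choice={"type": "auto"},
    messages=[{"role": "user", "content": "I am working on improving my photography skills."}],
)
print(response.content)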

I've also seen examples that use only graph_store in their config (no vector_store), but I'm having issues with that setup as well; this is the error I get:

config = {
    # "vector_store": {
    #     "provider": "qdrant",
    #     "config": {
    #         "collection_name": "test",
    #         "host": "localhost",
    #         "port": 6333,
    #         "embedding_model_dims": 768,
    #     },
    # },
    "llm": {
        "provider": "anthropic",
        "config": {
            "model": "claude-3-5-sonnet-20241022",
            "temperature": 0.1,
            "max_tokens": 8192,
        },
    },
    "graph_store": {
        "provider": "neo4j",
        "config": {
            "url": "neo4j://localhost:7687",
            "username": "neo4j",
            "password": "password",
            "embedding_model_dims": 768,
        },
    },
    "embedder": {
        "provider": "ollama",
        "config": {
            "model": "nomic-embed-text:latest",
            "ollama_base_url": "http://localhost:11434",
        },
    },
    "version": "v1.1"
}
ValueError: shapes (0,1536) and (768,) not aligned: 1536 (dim 1) != 768 (dim 0)
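
The 768 here matches nomic-embed-text, which produces 768-dimensional vectors, while 1536 happens to be the dimension of OpenAI's default embedding models, so I suspect something on the graph-store path is falling back to a default instead of using embedding_model_dims. A quick check against the local Ollama server to confirm the embedder's output size (uses Ollama's /api/embeddings endpoint):

import requests

# Query the local Ollama embeddings endpoint directly to confirm the vector size.
resp = requests.post(
    "http://localhost:11434/api/embeddings",
    json={"model": "nomic-embed-text:latest", "prompt": "dimension check"},
    timeout=30,
)
resp.raise_for_status()
print(len(resp.json()["embedding"]))  # prints 768 for nomic-embed-text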

Versions

  • Mem0 = 0.1.36
  • Neo4j = 4.1.13
  • Anthropic (PIP) = 0.42.0