Discussions
voyage-large-02 is not supported
I used LangChain as a wrapper to access the Voyage AI embeddings. I tried to use voyage-large-02
in my local notebook and there were no issues. But when I deployed it to my CI/CD pipeline, which uses a Kubernetes pod as the instance, I got an error saying:
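For context, this is roughly how the model is wired up through LangChain; the class and parameter names below come from langchain_community and may differ in other LangChain versions, and the API key is just a placeholder:

```python
# Rough sketch of my setup (class/parameter names may vary by LangChain version).
from langchain_community.embeddings import VoyageEmbeddings

embeddings = VoyageEmbeddings(
    voyage_api_key="YOUR_VOYAGE_API_KEY",  # placeholder; normally read from the environment
    model="voyage-large-02",               # the model that fails inside the CI/CD pod
)

# Works in the local notebook, fails in the Kubernetes pod.
vector = embeddings.embed_query("hello world")
print(len(vector))
```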
Is there support for asynchronous requests?
For my use case, I need to make multiple non-blocking embedding calls in parallel. I can accomplish this using an asynchronous HTTP client, like aiohttp, but I'm wondering if it's doable with the Python client.
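For reference, this is the kind of aiohttp workaround I have in mind; I'm assuming the REST endpoint https://api.voyageai.com/v1/embeddings with a standard bearer-token request and the voyage-2 model name, so treat it as a sketch rather than a confirmed recipe:

```python
import asyncio
import os

import aiohttp

VOYAGE_URL = "https://api.voyageai.com/v1/embeddings"  # assumed REST endpoint

async def embed(session: aiohttp.ClientSession, text: str, model: str = "voyage-2") -> list[float]:
    """Fire one non-blocking embedding request and return the vector."""
    payload = {"input": [text], "model": model}
    async with session.post(VOYAGE_URL, json=payload) as resp:
        resp.raise_for_status()
        body = await resp.json()
        return body["data"][0]["embedding"]

async def main() -> None:
    headers = {"Authorization": f"Bearer {os.environ['VOYAGE_API_KEY']}"}
    async with aiohttp.ClientSession(headers=headers) as session:
        # Several embedding calls run concurrently instead of one after another.
        texts = ["query one", "query two", "query three"]
        vectors = await asyncio.gather(*(embed(session, t) for t in texts))
    print([len(v) for v in vectors])

if __name__ == "__main__":
    asyncio.run(main())
```

This works, but it means re-implementing retries, batching, and error handling that the official Python client already has, which is why I'm asking whether async is supported there directly.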
How are the instructions passed to voyage-lite-02-instruct?
How do I specify the instruction for voyage-lite-02-instruct? I can see the options here https://github.com/voyage-ai/voyage-lite-02-instruct/blob/main/instruct.json, but there doesn't seem to be any way to pass the instruction through the Python library yet.
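The workaround I've been considering is prepending the instruction text to the query myself before embedding. The instruction string below is just an illustrative example (the real ones are in the instruct.json linked above), and the embed call is my guess at the client interface, which may differ by voyageai package version, so this is only a sketch:

```python
import voyageai

# Illustrative instruction string -- the real options live in instruct.json in the repo above.
INSTRUCTION = "Represent the query for retrieving supporting documents: "

def embed_with_instruction(query: str) -> list[float]:
    """Prepend the instruction to the query text before embedding (workaround sketch)."""
    # get_embeddings() is the module-level helper in the 0.1.x voyageai client;
    # newer client versions may expose a different call such as voyageai.Client().embed().
    vectors = voyageai.get_embeddings(
        [INSTRUCTION + query],
        model="voyage-lite-02-instruct",
    )
    return vectors[0]
```

Is manual prepending like this the intended usage, or is there (or will there be) a dedicated parameter for the instruction?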
Do Voyage AI embeddings allow dimension reduction?
Do Voyage AI embeddings allow dimension trimming with Matryoshka Representation Learning like the newer OpenAI embedding models do?
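To clarify what I mean by trimming, the sketch below is the MRL-style truncate-and-renormalize step; it only makes sense if the model was actually trained with Matryoshka Representation Learning, which is what I'm asking about:

```python
import numpy as np

def truncate_embedding(vec: list[float], dims: int = 256) -> np.ndarray:
    """Keep the first `dims` dimensions and re-normalize to unit length.

    This is only meaningful if the embedding model was trained with
    Matryoshka Representation Learning; otherwise the truncated vector
    loses most of its usefulness for similarity search.
    """
    truncated = np.asarray(vec[:dims], dtype=np.float32)
    norm = np.linalg.norm(truncated)
    return truncated / norm if norm > 0 else truncated
```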