Conversation
Can this also support models deployed with a SageMaker endpoint? Not sure, because in the AWS console the ARNs look like:
Thanks for the contribution. I'm afraid I can't merge the code for now. The repo uses the Converse API to access Bedrock models, but right now the Converse API has very limited support for custom imported models, so it would be confusing to include such a feature at this stage. We may also want to handle the cold start of imported models in the future.
No, this is limited to just Bedrock model imports (which are different from Bedrock fine-tuned models).
@daixba It would be useful to see the roadmap for custom model import. Right now the integration is very much lacking, and it seems like Bedrock is not investing in custom model import.
Signed-off-by: Sean Smith <sean.smith@contextual.ai>
e2e7d25 to a0982a4
This allows users to call models they've imported (if enabled by an environment variable):
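A minimal sketch of what such an opt-in gate could look like; the variable name `ENABLE_CUSTOM_MODELS` is hypothetical and may differ from what the PR actually uses:

```python
import os

# Hypothetical flag name -- the actual environment variable in this PR
# may differ; this only sketches the opt-in gate described above.
def custom_models_enabled() -> bool:
    """True when the operator has opted in to exposing imported models."""
    return os.environ.get("ENABLE_CUSTOM_MODELS", "false").lower() == "true"

os.environ["ENABLE_CUSTOM_MODELS"] = "true"
print(custom_models_enabled())  # True
```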
For example:
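A sketch of the request shape, assuming the gateway's OpenAI-style chat-completions interface; the imported-model ARN and account ID below are placeholders, not values from this PR:

```python
import json

# Placeholder ARN and account ID -- illustrative only. The model field
# carries the Bedrock imported-model ARN instead of a foundation-model ID.
payload = {
    "model": "arn:aws:bedrock:us-east-1:123456789012:imported-model/abc123",
    "messages": [{"role": "user", "content": "Hello"}],
}
print(json.dumps(payload, indent=2))
```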
These models also show up in the model list:
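One way the listing could merge the two sources when the flag is on; the model IDs and merge logic here are illustrative, not the PR's actual code (in practice the two lists would come from the Bedrock `ListFoundationModels` and `ListImportedModels` APIs):

```python
# Illustrative model IDs, not real deployment values.
FOUNDATION_MODELS = ["anthropic.claude-3-sonnet-20240229-v1:0"]
IMPORTED_MODELS = ["arn:aws:bedrock:us-east-1:123456789012:imported-model/abc123"]

def list_models(include_imported: bool) -> list[str]:
    """Return the model IDs the gateway advertises in its model list."""
    return FOUNDATION_MODELS + (IMPORTED_MODELS if include_imported else [])

print(list_models(include_imported=True))
```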
This resolves #99
By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.