This model was converted from BF16 precision to int4_asym (4-bit asymmetric weight quantization) using Optimum-Intel.
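A conversion like this can be reproduced with Optimum-Intel's OpenVINO export CLI; the sketch below is a hedged example rather than the exact command used for this model, and `my-org/my-bf16-model` is a placeholder model ID, not the actual source checkpoint.

```shell
# Export a BF16 Hugging Face checkpoint to OpenVINO IR with int4 weight
# quantization. For the int4 weight format, Optimum-Intel applies
# asymmetric quantization (int4_asym) by default.
optimum-cli export openvino \
  --model my-org/my-bf16-model \
  --weight-format int4 \
  ov_int4_model
```

The resulting `ov_int4_model` directory can then be loaded with `OVModelForCausalLM.from_pretrained` from the `optimum.intel` package for inference on Intel hardware.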