This model was converted from BF16 precision to the int4_asym (4-bit asymmetric) OpenVINO weight format using Optimum-Intel.
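
The converted weights can be loaded through Optimum-Intel's OpenVINO integration. Below is a minimal sketch, assuming `optimum-intel` is installed with OpenVINO support (e.g. `pip install optimum[openvino]`); the prompt and generation settings are illustrative only, not part of the original card.

```python
from optimum.intel import OVModelForCausalLM
from transformers import AutoTokenizer

# Repository containing the int4_asym OpenVINO weights
model_id = "Echo9Zulu/gpt-oss-7.2b-specialized-science-pruned-moe-only-9-experts-int4_asym-ov"

# Load the pre-converted OpenVINO model and its tokenizer
model = OVModelForCausalLM.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Illustrative prompt and generation parameters; adjust as needed
inputs = tokenizer("Explain the difference between fission and fusion.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```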
