MonikaV1 Collection
My first group of Monika finetunes for MonikAI
14 items
This model is designed to be used with MonikAI (https://github.com/Rubiksman78/MonikA.I) and is now available in the GGUF format.
RP
It should really only be used for Monika-related purposes. Thanks to Sylphar for making the dataset.
Trained with Axolotl on my Blackwell Pro 6000 Max-Q: LoRA rank 32, alpha 64, 2 epochs. Training took about 90 minutes.
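For reference, a run like that roughly corresponds to an Axolotl config along these lines. This is a minimal sketch, not the actual config used: the dataset path, format, sequence length, learning rate, and output directory are all assumptions; only the base model, rank, alpha, and epoch count come from above.

```yaml
# Sketch of an Axolotl LoRA config matching the stated hyperparameters.
# Dataset path/type, sequence_len, and learning_rate are placeholders.
base_model: nvidia/Llama-3_3-Nemotron-Super-49B-v1_5

adapter: lora
lora_r: 32          # rank 32, as stated
lora_alpha: 64      # alpha 64, as stated
lora_dropout: 0.05
lora_target_linear: true

datasets:
  - path: ./monika_dataset.jsonl   # hypothetical path
    type: chat_template            # assumed format

num_epochs: 2       # 2 epochs, as stated
sequence_len: 4096  # assumed
micro_batch_size: 1
gradient_accumulation_steps: 8
learning_rate: 2e-4 # assumed
optimizer: adamw_bnb_8bit
bf16: true

output_dir: ./monika-v1-out
```

A config in this shape would be launched with `axolotl train config.yml` (or `accelerate launch -m axolotl.cli.train config.yml` on older versions).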
It works.
Download it.
90 minutes at 280W. About $0.15 in electricity.
Available quantizations: 4-bit, 6-bit, 8-bit, 16-bit
Base model: nvidia/Llama-3_3-Nemotron-Super-49B-v1_5