Welcome to OpenGPT

The OpenGPT series of LLMs is based on GPTs from OpenAI: the proprietary ones! Yep! They are designed for everyday chat, professional tasks, finance, coding, and data analysis. This model card provides the information for OpenGPT 5.5, the main model in the series.

Legal Notes

  • The making of this LLM DID NOT involve any transfer of the proprietary GPT weights.
  • The safetensors file included in the model repository is custom made; it is NOT a GPT weight file.
  • It was produced with vector mathematics over custom tensors and does not, in any way, contain proprietary GPT weights.

The making

Natarajan Intelligence Technologies, formerly GoodGoals, took a large dataset from Hugging Face, sent the same queries from that dataset to GPT 5.2, and built a second dataset from the GPT 5.2 answers. From there, we used vector mathematics to improve the model's responses by comparing the answers in the GPT dataset to the ones in the large HF dataset. The resulting vectors were saved as a safetensors file and uploaded to Hugging Face! We did not use any actual GPT weights; we only used vector safetensors that improve the model's performance. To verify this, you can view the .safetensors file, whose metadata lists the training examples we used. You can also build models like this with correction vectors by using our EfficientTrainer.
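The card does not spell out the vector mathematics, so the following is only a hypothetical sketch of one plausible scheme: a "correction vector" computed as the mean difference between embeddings of the teacher (GPT 5.2) answers and the corresponding answers in the base HF dataset. The `embed` function, the toy data, and the dimension are all illustrative stand-ins, not the actual pipeline.

```python
import json
import zlib
import numpy as np

dim = 8  # toy embedding dimension, purely illustrative

def embed(text: str) -> np.ndarray:
    """Stand-in embedding: a deterministic pseudo-embedding for the demo.
    A real pipeline would use an actual sentence-embedding model here."""
    seed = zlib.crc32(text.encode())
    return np.random.default_rng(seed).standard_normal(dim)

hf_answers = ["answer a", "answer b", "answer c"]    # from the large HF dataset
gpt_answers = ["better a", "better b", "better c"]   # collected from GPT 5.2

# One difference vector per example: teacher embedding minus base embedding,
# then averaged into a single correction vector.
diffs = np.stack([embed(g) - embed(h) for g, h in zip(gpt_answers, hf_answers)])
correction = diffs.mean(axis=0)

# safetensors metadata must be a str -> str mapping, so the training examples
# can be recorded as a JSON string alongside the tensor.
metadata = {"examples": json.dumps(list(zip(hf_answers, gpt_answers)))}

# Persisting would use the safetensors library, e.g.:
# from safetensors.numpy import save_file
# save_file({"correction": correction.astype(np.float32)},
#           "correction.safetensors", metadata=metadata)

print(correction.shape)  # (8,)
```

Inspecting the metadata afterwards (the verification step the card mentions) would then be a matter of opening the file with `safetensors.safe_open` and reading its `.metadata()` mapping.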
