Introducing Hito 1.7B: Our First Open Source Model
We are releasing Hito 1.7B — an early preview of our adaptive reasoning technology. This proof-of-concept model is free and open source under Apache 2.0. Our flagship model is coming soon.
Preview Model: Not Production Ready
This is an early preview release: the model is still being fine-tuned and may produce inconsistent outputs. We are currently focused on training our larger flagship model. For production use, please use our API, which runs our latest optimized models.
Today marks a milestone for Hitonet: our first open-source release. Hito 1.7B is an early preview of our adaptive reasoning technology, completely free under Apache 2.0.
Hito 1.7B: Brain, Heart, and a Really Good Memory

- Parameters: 1.7B
- Context: 32K
- License: Apache 2.0
- GGUF: Available
Why Release a Preview?
We believe in building in public. While our production API runs significantly larger and more capable models, we wanted to share our progress with the community.
This 1.7B parameter model demonstrates our adaptive reasoning approach — it is small enough to run on consumer hardware and gives you a taste of what Hito is all about.
What Makes Hito Different
Unlike traditional AI assistants that apply rigid reasoning patterns, Hito uses adaptive cognitive reasoning — selecting the right mental tool for each challenge like a skilled craftsman.
Available Formats
- GGUF Q8 (1.8 GB) — Recommended for most users
- GGUF F16 (3.3 GB) — Maximum quality
- SafeTensors (3.4 GB) — For Transformers/PyTorch
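The file sizes above roughly track bytes per weight. A quick back-of-envelope check (the bytes-per-parameter figures are approximations, and real GGUF files also carry metadata and some mixed-precision tensors, which is why the listed sizes differ slightly):

```python
# Rough rule of thumb: checkpoint size ≈ parameter count × bytes per weight.
PARAMS = 1.7e9  # Hito 1.7B

def approx_size_gb(bytes_per_param: float) -> float:
    """Approximate dense checkpoint size in decimal gigabytes."""
    return PARAMS * bytes_per_param / 1e9

print(f"Q8  (~1 byte/param):  {approx_size_gb(1.0):.1f} GB")  # ~1.7 GB vs. 1.8 GB listed
print(f"F16 (2 bytes/param):  {approx_size_gb(2.0):.1f} GB")  # ~3.4 GB vs. 3.3 GB listed
```

The same arithmetic is a handy way to estimate whether a given quantization will fit in your RAM or VRAM before downloading.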
What's Coming Next
We are actively training our flagship model with significantly improved capabilities. This preview is just a glimpse of where we are headed.
Our plan is simple: keep releasing free, open source models for the community while offering API access to help sustain our research. We believe both can coexist.
Try Hito Today
Download the preview model to experiment, or use our production API for real applications.