Lucifer 8B: A Low-Latency Transformer with Edge Mesh Deployment and Reinforcement Learning from Human Feedback
<p dir="ltr">This paper introduces <b>Lucifer 8B</b>, an 8-billion-parameter large language model (LLM) designed to combine <b>high performance, low-latency inference, and robust alignment</b>. Unlike traditional models, Lucifer 8B integrates:</p><ul&...
Published: 2025