Lucifer 8B: A Low-Latency Transformer with Edge Mesh Deployment and Reinforcement Learning from Human Feedback
This paper introduces **Lucifer 8B**, an 8-billion-parameter large language model (LLM) designed to combine **high performance, low-latency inference, and robust alignment**. Unlike traditional models, Lucifer 8B integrates: …
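The abstract names reinforcement learning from human feedback (RLHF) as one of the model's alignment components. As a purely illustrative sketch (the excerpt gives no training details, so every name, number, and reward value below is hypothetical), a single KL-penalised RLHF-style policy update over a toy discrete response space can be written as:

```python
# Illustrative sketch only: one RLHF-style policy update on a toy discrete
# "response" space. All rewards, hyperparameters, and the 3-way action space
# are hypothetical; the paper's actual training setup is not described here.
import math
import random

def softmax(logits):
    m = max(logits)
    exps = [math.exp(l - m) for l in logits]
    s = sum(exps)
    return [e / s for e in exps]

rewards = [0.1, 1.0, 0.3]     # hypothetical reward-model scores per response
ref_logits = [0.0, 0.0, 0.0]  # frozen reference policy (uniform)
logits = [0.0, 0.0, 0.0]      # trainable policy logits
beta, lr = 0.1, 0.5           # KL penalty weight, learning rate

random.seed(0)
for _ in range(200):
    probs = softmax(logits)
    ref_probs = softmax(ref_logits)
    # Sample a response, then form the KL-penalised RLHF reward:
    #   r(x) - beta * log(pi(x) / pi_ref(x))
    i = random.choices(range(3), weights=probs)[0]
    shaped = rewards[i] - beta * math.log(probs[i] / ref_probs[i])
    # REINFORCE gradient of log pi(i) w.r.t. logit_j: 1[i=j] - probs[j]
    for j in range(3):
        grad = (1.0 if j == i else 0.0) - probs[j]
        logits[j] += lr * shaped * grad

probs = softmax(logits)
print(max(range(3), key=lambda k: probs[k]))
```

The KL term keeps the updated policy close to the frozen reference model, which is the standard way RLHF objectives are regularised; after training, the policy concentrates on the response the (toy) reward model scores highest.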
Publication: 2025