Lucifer 8B: A Low-Latency Transformer with Edge Mesh Deployment and Reinforcement Learning from Human Feedback

This paper introduces Lucifer 8B, an 8-billion-parameter large language model (LLM) designed to combine high performance, low-latency inference, and robust alignment. Unlike traditional models, Lucifer 8B integrates: …

Bibliographic details
First author: Lakshit Mathur (20894549) (author)
Published: 2025