Meta Unveils LLM Compiler, Surpassing GPT-4 with 7B and 13B Parameters
Jun 27, 2024, 05:41 PM
Meta has announced the Meta LLM Compiler, a family of models built on Meta Code Llama with additional code optimization and compiler capabilities. The models can emulate the compiler, predict optimal pass lists for code size, and disassemble code. Notably, the LLM Compiler beats GPT-4 on code-size improvement and disassembly, achieving 77% of the optimizing potential of an autotuning search and a 45% disassembly round-trip rate. The models work with x86 assembly and LLVM-IR, come in 7B and 13B parameter sizes, and can be fine-tuned for new tasks. The release marks a significant advancement in AI-driven code optimization.
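Meta released the model weights publicly, so the compiler-emulation capability can be tried directly. Below is a minimal sketch of prompting the 7B model to predict the result of optimizing a small LLVM-IR module; the model ID "facebook/llm-compiler-7b" and the prompt wording are assumptions based on Meta's release, so check the official model card for the exact names and prompt template.

```python
# Minimal sketch: asking LLM Compiler to emulate `opt` on a piece of LLVM-IR.
# Assumptions: the Hugging Face model ID "facebook/llm-compiler-7b" and the
# prompt wording below; verify both against Meta's model card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "facebook/llm-compiler-7b"  # assumed ID from Meta's release
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# The model is trained on LLVM-IR, so the prompt pairs an IR module with the
# optimization pipeline whose effect we want the model to predict.
llvm_ir = """define i32 @square(i32 %x) {
entry:
  %mul = mul nsw i32 %x, %x
  ret i32 %mul
}"""
prompt = (
    "Give the LLVM-IR for the following code when optimized "
    f"with opt -p 'module(default<Oz>)':\n\n{llvm_ir}\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Meta also released fine-tuned "FTD" variants (e.g., facebook/llm-compiler-7b-ftd) specialized for the flag-tuning and disassembly tasks, so pass-prediction and round-trip-disassembly prompts are best directed at those rather than the base models.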
Markets
No • 50%
Yes • 50%
Resolution source: press releases and official announcements from major tech firms (e.g., Google, Microsoft, Amazon)

No • 50%
Yes • 50%
Resolution source: publicly available benchmark results from reputable sources (e.g., MLPerf, academic papers)

No • 50%
Yes • 50%
Resolution source: official announcements from Meta, press releases

Meta LLM Compiler > • 33%
Equal market share • 33%
GPT-4 > • 33%
Resolution source: market analysis reports from reputable firms (e.g., Gartner, IDC)

More than 500 • 33%
100 to 500 • 33%
Less than 100 • 33%
Resolution source: GitHub repositories, public GitHub data

More than 90% • 33%
80% to 90% • 33%
Less than 80% • 33%
Resolution source: published performance benchmarks and studies