{"id":54798,"date":"2025-06-24T17:42:04","date_gmt":"2025-06-24T17:42:04","guid":{"rendered":""},"modified":"2025-06-30T17:20:57","modified_gmt":"2025-06-30T23:20:57","slug":"cve-2025-49847-buffer-overflow-vulnerability-in-llama-cpp-leading-to-potential-code-execution","status":"publish","type":"post","link":"https:\/\/www.ameeba.com\/blog\/cve-2025-49847-buffer-overflow-vulnerability-in-llama-cpp-leading-to-potential-code-execution\/","title":{"rendered":"<strong>CVE-2025-49847: Buffer Overflow Vulnerability in llama.cpp Leading to Potential Code Execution<\/strong>"},"content":{"rendered":"<p><strong>Overview<\/strong><\/p>\n<p>CVE-2025-49847 is a significant vulnerability in llama.cpp, a C\/C++ inference implementation for several large language models (LLMs). It is of high concern because it can allow an attacker to cause arbitrary memory corruption and even execute unauthorized code. This could <a href=\"https:\/\/www.ameeba.com\/blog\/cve-2025-49181-unauthorized-api-endpoint-access-leading-to-denial-of-service-and-data-leakage\/\"  data-wpil-monitor-id=\"61680\">lead to significant system compromise and data<\/a> leakage, affecting the many applications and services that rely on affected versions of llama.cpp. 
Given the potential severity of the impact, it&#8217;s crucial for organizations to understand this <a href=\"https:\/\/www.ameeba.com\/blog\/cve-2025-39486-rankie-sql-injection-vulnerability-and-mitigation-measures\/\"  data-wpil-monitor-id=\"62401\">vulnerability and take appropriate measures to mitigate<\/a> it.<\/p>\n<p><strong>Vulnerability Summary<\/strong><\/p>\n<p>CVE ID: CVE-2025-49847<br \/>\nSeverity: High (CVSS 8.8)<br \/>\nAttack Vector: Network<br \/>\nPrivileges Required: None<br \/>\nUser Interaction: None<br \/>\nImpact: <a href=\"https:\/\/www.ameeba.com\/blog\/cve-2025-49415-path-traversal-vulnerability-in-fw-gallery-with-potential-for-system-compromise\/\"  data-wpil-monitor-id=\"62400\">System compromise and potential<\/a> data leakage<\/p>\n<p><strong>Affected Products<\/strong><\/p>\n<table>\n<thead>\n<tr>\n<th>Product<\/th>\n<th>Affected Versions<\/th>\n<\/tr>\n<\/thead>\n<tbody>\n<tr>\n<td>llama.cpp<\/td>\n<td><a href=\"https:\/\/www.ameeba.com\/blog\/cve-2025-49137-critical-vulnerability-in-hax-cms-php-prior-to-version-11-0-0\/\"  data-wpil-monitor-id=\"61974\">Prior to version<\/a> b5662<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<p><strong>How the Exploit Works<\/strong><\/p>\n<p>The <a href=\"https:\/\/www.ameeba.com\/blog\/cve-2025-28386-remote-code-execution-vulnerability-in-openc3-cosmos-v6-0-0\/\"  data-wpil-monitor-id=\"61447\">vulnerability lies in the vocabulary-loading code<\/a> of llama.cpp. A helper function, _try_copy, in llama_vocab::impl::token_to_piece() casts an attacker-controlled size_t token length to int32_t. A sufficiently large length wraps to a negative value, so the guard (if (length < (int32_t)size)) is bypassed, and memcpy is still called with the original, oversized size_t. 
A malicious GGUF model vocabulary supplied by an attacker can exploit this to overwrite memory beyond the intended buffer, leading to arbitrary memory corruption and <a href=\"https:\/\/www.ameeba.com\/blog\/cve-2025-5491-acer-controlcenter-remote-code-execution-vulnerability-potential-system-compromise\/\"  data-wpil-monitor-id=\"61218\">potential unauthorized code execution<\/a>.<\/p>\n<p><strong>Conceptual Example Code<\/strong><\/p>\n<p>Below is a conceptual example of how this vulnerability might be exploited, written as pseudocode for an attacker-supplied GGUF model vocabulary containing an oversized token.<\/p>\n<pre><code class=\"\" data-line=\"\">\/\/ Conceptual pseudocode only; not the real llama.cpp API\n\/\/ Attacker crafts a GGUF vocabulary containing an oversized token\nstd::string malicious_vocab = createOversizedToken();\n\/\/ Victim application loads the malicious vocabulary\nllama_vocab vocab = llama_vocab::load_from_string(malicious_vocab);\n\/\/ The int32_t truncation defeats the length check, and memcpy\n\/\/ overflows the destination buffer\nvocab.token_to_piece(oversized_token);<\/code><\/pre>\n<p>In this example, createOversizedToken() is a hypothetical helper that produces a token whose length exceeds what int32_t can represent. The oversized token is loaded into llama.cpp through the (equally hypothetical) load_from_string() function, and the <a href=\"https:\/\/www.ameeba.com\/blog\/cve-2025-5934-critical-stack-based-buffer-overflow-vulnerability-in-netgear-ex3700\/\"  data-wpil-monitor-id=\"61192\">buffer overflow<\/a> is triggered when token_to_piece() is called with the oversized token. 
This could potentially lead to memory corruption and unauthorized <a href=\"https:\/\/www.ameeba.com\/blog\/cve-2025-29902-a-high-risk-remote-code-execution-vulnerability\/\"  data-wpil-monitor-id=\"61394\">code execution<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Overview CVE-2025-49847 is a significant vulnerability found in the llama.cpp, a C\/C++ implementation of several LLM models. This vulnerability is of high concern due to its potential to allow an attacker to cause arbitrary memory corruption and even execute unauthorized code. This could lead to significant system compromise and data leakage, affecting various applications and [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"om_disable_all_campaigns":false,"footnotes":""},"categories":[1],"tags":[],"vendor":[],"product":[],"attack_vector":[86],"asset_type":[],"severity":[],"exploit_status":[],"class_list":["post-54798","post","type-post","status-publish","format-standard","hentry","category-uncategorized","attack_vector-buffer-overflow"],"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/www.ameeba.com\/blog\/wp-json\/wp\/v2\/posts\/54798","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.ameeba.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.ameeba.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.ameeba.com\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.ameeba.com\/blog\/wp-json\/wp\/v2\/comments?post=54798"}],"version-history":[{"count":7,"href":"https:\/\/www.ameeba.com\/blog\/wp-json\/wp\/v2\/posts\/54798\/revisions"}],"predecessor-version":[{"id":56110,"href":"https:\/\/www.ameeba.com\/blog\/wp-json\/wp\/v2\/posts\/54798\/revisions\/56110"}],"wp:attachment":[{"href":"https:\/\/www.ameeba.com\/blog\/wp-json\/wp\/v2\/media?parent=54798
"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.ameeba.com\/blog\/wp-json\/wp\/v2\/categories?post=54798"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.ameeba.com\/blog\/wp-json\/wp\/v2\/tags?post=54798"},{"taxonomy":"vendor","embeddable":true,"href":"https:\/\/www.ameeba.com\/blog\/wp-json\/wp\/v2\/vendor?post=54798"},{"taxonomy":"product","embeddable":true,"href":"https:\/\/www.ameeba.com\/blog\/wp-json\/wp\/v2\/product?post=54798"},{"taxonomy":"attack_vector","embeddable":true,"href":"https:\/\/www.ameeba.com\/blog\/wp-json\/wp\/v2\/attack_vector?post=54798"},{"taxonomy":"asset_type","embeddable":true,"href":"https:\/\/www.ameeba.com\/blog\/wp-json\/wp\/v2\/asset_type?post=54798"},{"taxonomy":"severity","embeddable":true,"href":"https:\/\/www.ameeba.com\/blog\/wp-json\/wp\/v2\/severity?post=54798"},{"taxonomy":"exploit_status","embeddable":true,"href":"https:\/\/www.ameeba.com\/blog\/wp-json\/wp\/v2\/exploit_status?post=54798"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}