
Get notified when packages are built

A new feature has been added. FreshPorts already tracks packages built by the FreeBSD project, and that information is displayed on each port page. You can now get an email when FreshPorts notices that a new package is available for something on one of your watch lists. However, you must opt in: click Report Subscriptions on the right, check the New Package Notification box, and click Update.

Finally, under Watch Lists, click ABI Package Subscriptions to select the ABI (e.g. FreeBSD:14:amd64) and package set (latest or quarterly) combination for a given watch list. This is what FreshPorts will look for.

Port details on branch 2024Q2
llama-cpp: Facebook's LLaMA model in C/C++
Version: 2619 (also the version present on the latest quarterly branch)
Category: misc
Maintainer: yuri@FreeBSD.org
Port Added: 2024-05-06 08:42:09
Last Update: 2024-05-06 08:40:10
Commit Hash: 8b7f2df
License: MIT
WWW:
https://github.com/ggerganov/llama.cpp
Description:
The main goal of llama.cpp is to enable LLM inference with minimal setup and state-of-the-art performance on a wide variety of hardware - locally and in the cloud.

Manual pages:
FreshPorts has no man page information for this port.
pkg-plist: as obtained via: make generate-plist
  1. @ldconfig
  2. /usr/local/share/licenses/llama-cpp-2619/catalog.mk
  3. /usr/local/share/licenses/llama-cpp-2619/LICENSE
  4. /usr/local/share/licenses/llama-cpp-2619/MIT
  5. bin/baby-llama
  6. bin/batched
  7. bin/batched-bench
  8. bin/beam-search
  9. bin/benchmark
  10. bin/convert-llama2c-to-ggml
  11. bin/convert-lora-to-ggml.py
  12. bin/convert.py
  13. bin/embedding
  14. bin/export-lora
  15. bin/finetune
  16. bin/gguf
  17. bin/gguf-split
  18. bin/gritlm
  19. bin/imatrix
  20. bin/infill
  21. bin/llama-bench
  22. bin/llava-cli
  23. bin/lookahead
  24. bin/lookup
  25. bin/lookup-create
  26. bin/lookup-merge
  27. bin/lookup-stats
  28. bin/main
  29. bin/parallel
  30. bin/passkey
  31. bin/perplexity
  32. bin/quantize
  33. bin/quantize-stats
  34. bin/retrieval
  35. bin/save-load-state
  36. bin/server
  37. bin/simple
  38. bin/speculative
  39. bin/tokenize
  40. bin/train-text-from-scratch
  41. include/ggml-alloc.h
  42. include/ggml-backend.h
  43. include/ggml.h
  44. include/llama.h
  45. lib/cmake/Llama/LlamaConfig.cmake
  46. lib/cmake/Llama/LlamaConfigVersion.cmake
  47. lib/libggml_shared.so
  48. lib/libllama.so
  49. lib/libllava_shared.so
  50. @owner
  51. @group
  52. @mode
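The bin/main binary in the plist above is llama.cpp's command-line inference driver. A minimal usage sketch, assuming a GGUF-format model has already been downloaded separately (the model path below is hypothetical; the port ships no model files):

```shell
# Hypothetical sketch: run a short completion with a locally stored
# GGUF model. The model path is an assumption, not part of the port.
main -m /path/to/model.gguf -p "FreeBSD is" -n 64
```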
Dependency lines:
  • llama-cpp>0:misc/llama-cpp
To install the port:
cd /usr/ports/misc/llama-cpp/ && make install clean
To add the package, run one of these commands:
  • pkg install misc/llama-cpp
  • pkg install llama-cpp
NOTE: If this package has multiple flavors (see below), then use one of them instead of the name specified above.
PKGNAME: llama-cpp
Flavors: there is no flavor information for this port.
distinfo:
TIMESTAMP = 1712464416
SHA256 (ggerganov-llama.cpp-b2619_GH0.tar.gz) = 5a9a982009e689ea93321269e6ee6fcdd20413ddb1480a899cea822173358e75
SIZE (ggerganov-llama.cpp-b2619_GH0.tar.gz) = 8925180

SHA256 (nomic-ai-kompute-4565194_GH0.tar.gz) = 95b52d2f0514c5201c7838348a9c3c9e60902ea3c6c9aa862193a212150b2bfc
SIZE (nomic-ai-kompute-4565194_GH0.tar.gz) = 13540496

Packages (timestamps in pop-ups are UTC):
llama-cpp
ABI                   aarch64  amd64  armv6  armv7  i386  powerpc  powerpc64  powerpc64le
FreeBSD:13:latest     2940     2940   -      -      2972  -        -          -
FreeBSD:13:quarterly  2619     2619   -      -      2619  -        -          -
FreeBSD:14:latest     2797     2992   -      -      2992  -        -          -
FreeBSD:14:quarterly  2619     2619   -      -      2619  -        -          -
FreeBSD:15:latest     2780     2940   n/a    2749   n/a   -        -          2241
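Which row and column of the table applies to a given system depends on its package ABI and on whether the configured repository tracks latest or quarterly. A sketch for checking both with pkg(8):

```shell
# Show this system's package ABI, e.g. FreeBSD:14:amd64
pkg config abi
# Exact-name search for the package in the configured repository
pkg search -e llama-cpp
```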
Dependencies
NOTE: FreshPorts displays only information on required and default dependencies. Optional dependencies are not covered.
Build dependencies:
  1. cmake : devel/cmake-core
  2. ninja : devel/ninja
Runtime dependencies:
  1. python3.9 : lang/python39
There are no ports dependent upon this port.
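Once the package is installed, the dependency information above can be cross-checked locally; a sketch with pkg(8):

```shell
# List packages that llama-cpp depends on
pkg info -d llama-cpp
# List installed packages that depend on llama-cpp
pkg info -r llama-cpp
```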

Configuration Options:
===> The following configuration options are available for llama-cpp-2619:
     EXAMPLES=on: Build and/or install examples
===> Use 'make config' to modify these settings
Options name:
misc_llama-cpp
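The EXAMPLES option can also be toggled non-interactively when building from ports; a sketch using the standard ports options variables:

```shell
# Build with the EXAMPLES option disabled, without opening the
# interactive 'make config' dialog.
cd /usr/ports/misc/llama-cpp
make OPTIONS_UNSET=EXAMPLES install clean
```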
USES:
cmake:testing compiler:c++11-lang python:run shebangfix
FreshPorts was unable to extract/find any pkg message
Master Sites:
  1. https://codeload.github.com/ggerganov/llama.cpp/tar.gz/b2619?dummy=/

Number of commits found: 1

Commit History - (may be incomplete: for full details, see links to repositories near top of page)
Commit: 2619
Date: 06 May 2024 08:40:10
Commit hash: 8b7f2df6ba5c844fb0feaf6ff3d1f704126da6ea (files touched by this commit)
Committer: Yuri Victorovich (yuri)
Log message: misc/llama-cpp: Broken on armv7

Reported by:	fallout

(cherry picked from commit 594f2ade6789a66fd495f326bdf2ca6050cf0800)
