Port details
- llama-cpp: Facebook's LLaMA model in C/C++
- Version: 5195
- Category: misc
- Version on the latest quarterly branch: 4967
- Maintainer: yuri@FreeBSD.org
 - Port Added: 2024-02-15 11:27:23
- Last Update: 2025-04-27 03:38:21
- Commit Hash: e351344
- People watching this port also watch: autoconf, ta-lib, weberp, prestashop, irrlicht
- License: MIT
- WWW:
- https://github.com/ggerganov/llama.cpp
- Description:
- The main goal of llama.cpp is to enable LLM inference with minimal setup and
state-of-the-art performance on a wide variety of hardware - locally and in
the cloud.
- Manual pages:
- FreshPorts has no man page information for this port.
- pkg-plist: as obtained via: make generate-plist
- Dependency lines:
  - llama-cpp>0:misc/llama-cpp
- To install the port:
- cd /usr/ports/misc/llama-cpp/ && make install clean
- To add the package, run one of these commands:
- pkg install misc/llama-cpp
- pkg install llama-cpp
  NOTE: If this package has multiple flavors (see below), then use one of them instead of the name specified above.
- PKGNAME: llama-cpp
- Flavors: there is no flavor information for this port.
- distinfo:
- TIMESTAMP = 1745716899
SHA256 (ggerganov-llama.cpp-b5195_GH0.tar.gz) = 9dee0d0e9a645d232415e1d2b252fd3938f11357b430d268da17bd17db668d95
SIZE (ggerganov-llama.cpp-b5195_GH0.tar.gz) = 21069357
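The distinfo above is what the ports framework checks the downloaded tarball against. A minimal sketch of how it is verified or regenerated when testing a port update locally (assuming a ports tree checked out under /usr/ports):
$ cd /usr/ports/misc/llama-cpp
$ make checksum   # fetch the distfile and verify it against the SHA256/SIZE in distinfo
$ make makesum    # (maintainer workflow) re-fetch and regenerate distinfo for a new upstream tag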
- Dependencies
- NOTE: FreshPorts displays only information on required and default dependencies. Optional dependencies are not covered.
- Build dependencies:
  - glslc : graphics/shaderc
  - vulkan-headers>0 : graphics/vulkan-headers
  - cmake : devel/cmake-core
  - ninja : devel/ninja
- Runtime dependencies:
  - python3.11 : lang/python311
- Library dependencies:
  - libcurl.so : ftp/curl
  - libvulkan.so : graphics/vulkan-loader
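As a quick sanity check of the library dependencies listed above, the installed binaries can be inspected with ldd; the binary path below is an assumption based on the pkg-message further down:
$ ldd /usr/local/bin/llama-server | grep -E 'libcurl|libvulkan'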
- This port is required by (for Libraries):
  - devel/tabby
- Configuration Options:
  ===> The following configuration options are available for llama-cpp-5195:
  CURL=on: Data transfer support via cURL
  EXAMPLES=on: Build and/or install examples
  VULKAN=on: Vulkan GPU offload support
  ===> Use 'make config' to modify these settings
- Options name:
- misc_llama-cpp
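The options name above is the prefix the ports options framework expects in /etc/make.conf, which lets you persist non-default option choices without the interactive 'make config' dialog. A minimal sketch, run as root; the specific SET/UNSET choices are only an illustration:
$ echo 'misc_llama-cpp_SET+= VULKAN CURL' >> /etc/make.conf
$ echo 'misc_llama-cpp_UNSET+= EXAMPLES' >> /etc/make.conf
$ cd /usr/ports/misc/llama-cpp && make install clean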
- USES:
- cmake:testing compiler:c++11-lang python:run shebangfix localbase
- pkg-message:
- For install:
- You installed llama-cpp: Facebook's LLaMA model runner.
To use it, download an AI model in the GGUF format (for example
from huggingface.co), run the command below, and open
http://localhost:9011 in your browser to chat with the model.
$ llama-server -m $MODEL \
--host 0.0.0.0 \
--port 9011 \
-ngl 15
or
you can add the following lines to /etc/rc.conf,
start the llama-server service,
and navigate to http://localhost:8080:
> llama_server_enable=YES
> llama_server_model=/path/to/models/llama-2-7b-chat.Q4_K_M.gguf
> llama_server_args="--device Vulkan0 -ngl 27"
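A minimal sketch of the rc.conf route using sysrc, run as root, assuming the rc.d script installed by the port is named llama_server (check /usr/local/etc/rc.d/ for the exact name) and that the model path is replaced with a real GGUF file:
$ sysrc llama_server_enable=YES
$ sysrc llama_server_model=/path/to/models/llama-2-7b-chat.Q4_K_M.gguf
$ sysrc llama_server_args="--device Vulkan0 -ngl 27"
$ service llama_server start
$ fetch -qo - http://localhost:8080/health   # llama-server answers on /health once the model is loaded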
- Master Sites:
Commit History (may be incomplete; for full details, see links to repositories near top of page)
Commit | Credits | Log message
2797 (07 May 2024 07:17:04) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2789 → 2797 (Reported by: portscout)
2789 (06 May 2024 08:39:48) | Yuri Victorovich (yuri) | misc/llama-cpp: Broken on armv7 (Reported by: fallout)
2789 (05 May 2024 08:18:43) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2784 → 2789 (Reported by: portscout)
2784 (04 May 2024 08:27:21) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2780 → 2784 (Reported by: portscout)
2780 (02 May 2024 08:57:59) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2775 → 2780 (Reported by: portscout)
2775 (02 May 2024 08:57:57) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2769 → 2775 (Reported by: portscout)
2769 (30 Apr 2024 05:24:37) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2755 → 2769 (Reported by: portscout)
2755 (29 Apr 2024 07:21:41) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2751 → 2755 (Reported by: portscout)
2751 (28 Apr 2024 07:09:59) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2749 → 2751 (Reported by: portscout)
2749 (27 Apr 2024 05:55:19) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2717 → 2749 (Reported by: portscout)
2717 (26 Apr 2024 06:15:14) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2715 → 2717 (Reported by: portscout)
2715 (24 Apr 2024 09:17:15) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2710 → 2715 (Reported by: portscout)
2710 (22 Apr 2024 07:01:14) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2699 → 2710 (Reported by: portscout)
2699 (20 Apr 2024 07:27:28) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2694 → 2699 (Reported by: portscout)
2694 (19 Apr 2024 01:27:06) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2690 → 2694 (Reported by: portscout)
2690 (18 Apr 2024 06:02:41) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2687 → 2690 (Reported by: portscout)
2687 (17 Apr 2024 07:38:19) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2679 → 2687 (Reported by: portscout)
2679 (16 Apr 2024 04:15:48) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2674 → 2679
2674 (15 Apr 2024 08:57:49) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2646 → 2674 (Reported by: portscout)
2646 (11 Apr 2024 08:17:16) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2619 → 2646 (Reported by: portscout)
2619 (07 Apr 2024 18:38:03) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2615 → 2619 (Reported by: portscout)
2615 (06 Apr 2024 01:10:44) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2608 → 2615 (Reported by: portscout)
2608 (05 Apr 2024 09:15:36) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2589 → 2608 (Reported by: portscout)
2589 (04 Apr 2024 08:05:04) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2581 → 2589 (Reported by: portscout)
2581 (31 Mar 2024 08:32:53) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2568 → 2581 (Reported by: portscout)
2568 (29 Mar 2024 20:03:44) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2531 → 2568 (Reported by: portscout)
2531 (27 Mar 2024 08:37:47) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2517 → 2531 (Reported by: portscout)
2517 (25 Mar 2024 05:00:39) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2509 → 2517 (Reported by: portscout)
2509 (24 Mar 2024 09:59:11) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2487 → 2509 (Reported by: portscout)
2487 (22 Mar 2024 12:22:51) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2479 → 2487 (Reported by: portscout)
2479 (21 Mar 2024 10:10:56) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2465 → 2479 (Reported by: portscout)
2465 (20 Mar 2024 08:59:51) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2450 → 2465 (Reported by: portscout)
2450 (18 Mar 2024 16:24:19) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2440 → 2450 (Reported by: portscout)
2440 (17 Mar 2024 05:40:50) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2430 → 2440 (Reported by: portscout)
2430 (15 Mar 2024 15:46:26) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2409 → 2430 (Reported by: portscout)
2409 (13 Mar 2024 06:17:05) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2405 → 2409 (Reported by: portscout)
2405 (12 Mar 2024 19:42:11) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2393 → 2405 (Reported by: portscout)
2393 (11 Mar 2024 17:53:46) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2376 → 2393 (Reported by: portscout)
2376 (10 Mar 2024 07:35:57) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2366 → 2376 (Reported by: portscout)
2366 (09 Mar 2024 07:34:18) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2360 → 2366 (Reported by: portscout)
2360 (08 Mar 2024 10:25:53) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2355 → 2360 (Reported by: portscout)
2355 (07 Mar 2024 09:48:07) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2350 → 2355 (Reported by: portscout)
2350 (06 Mar 2024 11:52:34) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2329 → 2350 (Reported by: portscout)
2329 (04 Mar 2024 16:09:17) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2294 → 2329 (Reported by: portscout)
2294 (27 Feb 2024 00:31:15) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2266 → 2294 (Reported by: portscout)
2266 (26 Feb 2024 05:55:19) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2251 → 2266 (Reported by: portscout)
2251 (25 Feb 2024 00:18:07) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2241 → 2251 (Reported by: portscout)
2241 (23 Feb 2024 10:25:42) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2234 → 2241 (Reported by: portscout)
2234 (22 Feb 2024 09:38:40) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2212 → 2234 (Reported by: portscout)
2212 (20 Feb 2024 07:09:21) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2185 → 2212 (Reported by: portscout)
2185 (19 Feb 2024 05:01:44) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2167 → 2185 (Reported by: portscout)
2167 (17 Feb 2024 08:45:30) | Yuri Victorovich (yuri) | misc/llama-cpp: update 2144 → 2167 (Reported by: portscout)
2144 (15 Feb 2024 11:25:01) | Yuri Victorovich (yuri) | misc/llama-cpp: New port: Facebook's LLaMA model in C/C++