
Bug#1063673: ITP: llama.cpp -- Inference of Meta's LLaMA model (and others) in pure C/C++



I tried building the CPU edition on one machine and running it on another,
and experienced illegal instruction exceptions.  I suspect this means one
needs to be careful when selecting the build profile to ensure it works on
all supported Debian platforms.
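For reference, a more portable CPU build can probably be had by disabling
the host-specific optimizations at configure time.  This is only a sketch:
the exact CMake option name (GGML_NATIVE here, LLAMA_NATIVE in some older
releases) is an assumption and should be checked against the llama.cpp
version being packaged.

```shell
# Sketch: configure llama.cpp without -march=native so the resulting
# binary targets the baseline ISA instead of the build host's CPU.
# GGML_NATIVE is an assumed option name; verify against the release
# being packaged (older releases used LLAMA_NATIVE).
cmake -B build -DGGML_NATIVE=OFF -DCMAKE_BUILD_TYPE=Release
cmake --build build --config Release
```

Building this way should avoid baking in instructions (e.g. AVX2/AVX512)
that older CPUs on supported architectures lack.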

I would be happy to help get this up and running.  Please let me
know when you have published a git repo with the packaging rules.

-- 
Happy hacking
Petter Reinholdtsen
