Here is a typical run using LLaMA v2 13B on M2 Ultra:

```
$ ./main -m models/llama-13b-v2/ggml-model-q4_0.gguf -p "Building a website can be done in 10 simple steps:\nStep 1:" -n 400 -e
I llama.cpp build info:
I UNAME_S:  Darwin
I UNAME_P:  arm
I UNAME_M:  arm64
I CFLAGS:   -I.            -O3 -std=c11   -fPIC -DNDEBUG -Wall -Wextra -Wpedantic -Wcast-qual -Wdouble-promotion -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -pthread -DGGML_USE_K_QUANTS -DGGML_USE_ACCELERATE
I CXXFLAGS: -I. -I./common -O3 -std=c++11 -fPIC -DNDEBUG -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -DGGML_USE_K_QUANTS
I LDFLAGS:  -framework Accelerate
I CC:       Apple clang version 14.0
main: seed = 1692823051
llama_model_loader: loaded meta data with 16 key-value pairs and 363 tensors from models/llama-13b-v2/ggml-model-q4_0.gguf
llama_model_loader: - type  f32:   81 tensors
llama_model_loader: - type q4_0:  281 tensors
llama_model_loader: - type q6_K:    1 tensors
llm_load_print_meta: format = GGUF V1 (latest)
```
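The loader output above reports how many tensors use each quantization type (f32, q4_0, q6_K). As a quick sanity check, these per-type counts can be tallied from the log text and compared against the 363-tensor total the loader reports. This is a minimal standalone sketch, not part of llama.cpp itself:

```python
import re

# Model-loader lines as printed in the run above.
log = """\
llama_model_loader: - type  f32:   81 tensors
llama_model_loader: - type q4_0:  281 tensors
llama_model_loader: - type q6_K:    1 tensors
"""

# Map each quantization type to its tensor count.
counts = {}
for m in re.finditer(r"- type\s+(\S+):\s+(\d+) tensors", log):
    counts[m.group(1)] = int(m.group(2))

print(counts)                 # {'f32': 81, 'q4_0': 281, 'q6_K': 1}
print(sum(counts.values()))   # 363, matching the loader's reported total
```

Most tensors are q4_0 (the weight matrices), with the f32 tensors being small norm weights and a single q6_K tensor, which is why the file is roughly a quarter of the f16 model size.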