danielrosehill's Collections

Mobile LLMs

LLMs optimised for on-device inference (specifically, for this collection, models that can run on smartphones with standard inference capabilities).