
run / install on TermUX Android #7

Closed
lordunix opened this issue Aug 25, 2023 · 9 comments

Comments

@lordunix

lordunix commented Aug 25, 2023

not an issue

~/llama2.c $ ./run stories15M.bin
Once upon a time, there was a little boy named Timmy. Timmy was very hungry and wanted to eat a big turkey. But the turkey was on his way to a party.
Timmy tried to catch the turkey, but it was too fast. The turkey ran away and Timmy felt sad.
But then, Timmy saw a big cake. He wanted to eat it, so he cut a slice. But the cake was too big and Timmy's sister had already eaten it.
Timmy didn't like that. He wanted the cake, but his sister was being stubborn. Timmy's mom came over and saw what was happening. She told Timmy's sister to be kind and share the cake. So, Timmy's mom cut a big slice of the cake and they all enjoyed it together.
achieved tok/s: 281.859070

@trholding
Owner

Thank you!

If you like, I can add this as instructions for Android + TermUX and credit you. Alternatively, you can open a pull request.

Can you try this:

mkdir out
wget https://huggingface.co/karpathy/tinyllamas/resolve/main/stories15M.bin
cp stories15M.bin out/model.bin
make run_gcc_static_incbin
./run

If there are errors please let me know.

I'll try your instructions when back at my home base.

@lordunix
Author

You're welcome.

~/llama2.c $ make run_gcc_static_incbin
gcc -Ofast -static -march=native -D INC_BIN -D MODPATH=out/model.bin -D TOKPATH=tokenizer.bin -D LLOOP run.c -lm -o run
run.c:894:21: warning: assigning to 'char *' from 'const unsigned char[]' discards qualifiers [-Wincompatible-pointer-types-discards-qualifiers]
checkpoint_path = emb_Model_data;
^ ~~~~~~~~~~~~~~
run.c:895:20: warning: assigning to 'char *' from 'const unsigned char[]' discards qualifiers [-Wincompatible-pointer-types-discards-qualifiers]
tokenizer_path = emb_Tokenizer_data;
^ ~~~~~~~~~~~~~~~~~~
2 warnings generated.
ld.lld: error: unable to find library -lm
ld.lld: error: unable to find library -lc
clang-16: error: linker command failed with exit code 1 (use -v to see invocation)
make: *** [Makefile:151: run_gcc_static_incbin] Error 1

I'm looking for a solution now. It can't be that difficult.

trholding added a commit that referenced this issue Aug 26, 2023
Added make run_incbin target for termux on Android

Usage:

In termux do:

pkg upgrade
pkg install git
pkg install make
pkg install clang
pkg install wget
git clone https://github.com/trholding/llama2.c
cd llama2.c
make run_incbin
./run

Ref: #7
@trholding
Owner

There is a new make run_incbin target for termux.

On a fresh termux install this should work.

pkg upgrade
pkg install git
pkg install make
pkg install clang
pkg install wget
git clone https://github.com/trholding/llama2.c
cd llama2.c
make run_incbin
./run

The make target will automatically create the out dir, download and rename the model, and build.

I haven't tested yet.
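For reference, a hypothetical sketch of what a run_incbin-style target could look like (the actual rule in the repo's Makefile may differ; the compile flags are copied from the log earlier in this thread, minus the `-static` that failed on Termux):

```make
# Hypothetical sketch only; see the repo's Makefile for the real rule.
run_incbin:
	mkdir -p out
	wget -O out/model.bin https://huggingface.co/karpathy/tinyllamas/resolve/main/stories15M.bin
	$(CC) -Ofast -march=native -D INC_BIN -D MODPATH=out/model.bin -D TOKPATH=tokenizer.bin -D LLOOP run.c -lm -o run
```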

@lordunix
Author

lordunix commented Aug 26, 2023

Tested on a fresh termux install, and it works:

~/llama2.c $ ./run

L2E $ dog and sheep
dog and sheep 2iled sheep were good friends. Every day, the sun would rise, and the sheep would run and play together.
One day, the sheep found a rough rock and decided to give it a name. "I will call you Rock," she said to himself. She decided it would be her new friend.
The dog and the sheep played together all day. They ran around the farm, chased each other and hugged the cow.
But then, it started to rain. The dog got very wet and cold. He was sad and wanted to go home. The dog could not follow the sheep and now he was lost.
achieved tok/s: 316.037736

@romanovj

I compiled with gcc inside the termux glibc environment:

gcc -D OPENMP -Ofast -fopenmp -foffload-options="-Ofast -lm" -march=native run.c -lm -o run

cpu - sdm662

OMP_NUM_THREADS=1 ./run out/model.bin -n 256 -i "Once upon a time"

90 tok/s

OMP_NUM_THREADS=2 ./run out/model.bin -n 256 -i "Once upon a time"

120 tok/s

4 threads - 111 tok/s

@trholding
Owner

gcc -D OPENMP -Ofast -fopenmp -foffload-options="-Ofast -lm" -march=native run.c -lm -o run

It's great that there is OpenMP support. Most probably there could be OpenACC support too.

achieved tok/s: 316.037736

Phones have decent performance.

I'll add a series of make targets for termux in the Makefile and get back here.

Do you think I should add these to the targets, or do most users already have them installed?

For clang

pkg upgrade
pkg install git
pkg install make
pkg install clang
pkg install wget

For gcc

pkg upgrade
pkg install git
pkg install make
pkg install gcc
pkg install wget
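On the OpenACC thought above: run.c has no such pragmas today, but a minimal sketch of what an OpenACC-annotated loop would look like (hypothetical; without `-fopenacc` the pragma is ignored and the loop simply runs sequentially with the same result):

```c
// Hypothetical OpenACC sketch, not code from run.c: saxpy (y = a*x + y),
// the classic offloadable loop shape. Compile with e.g.
//   gcc -fopenacc acc_saxpy.c -o acc_saxpy
// or without -fopenacc for a sequential fallback.
#include <stdio.h>

void saxpy(int n, float a, const float *x, float *y) {
    #pragma acc parallel loop
    for (int i = 0; i < n; i++) y[i] = a * x[i] + y[i];
}

int main(void) {
    enum { N = 8 };
    float x[N], y[N];
    for (int i = 0; i < N; i++) { x[i] = (float)i; y[i] = 1.0f; }
    saxpy(N, 2.0f, x, y);
    printf("y[3]=%.0f\n", y[3]); // 2*3 + 1 = 7
    return 0;
}
```
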
@romanovj

romanovj commented Aug 27, 2023

@trholding
You shouldn't add gcc for termux; it's for nerds and not popular at all, and installing glibc gcc isn't that simple.

Termux's clang supports OpenMP; libomp.a is provided by clang.

clang -D OPENMP -Ofast -fopenmp -march=native run.c -lm -o run

OMP_NUM_THREADS=2 ./run out/model.bin -n 256 -i "Once upon a time"

115 tok/s vs 90 without openmp

@Martin1932

Termux error while upgrading; can anyone help me find the solution?

Screenshot_20230910_215741.jpg

@trholding
Owner

trholding commented Sep 10, 2023

Termux error while upgrading; can anyone help me find the solution?

This fork is only focused on the C inference part. That is, you can build run.c and run it.

I don't have many clues about getting Python up in termux.

It looks like your current termux environment is not sufficient to run the Python training and inference, because the requirements are not getting installed.

As seen in your screenshot, ninja fails to build / install... You could ask here for help: termux/termux-packages#8951

If you are running the python parts, please use upstream https://github.com/karpathy/llama2.c . This is a friendly fork that is 2 weeks out of sync.

Hope your issues will get resolved.
