
Adding a new config parameter to combine layers during FSDP #360

Open
wants to merge 10 commits into base: main

Conversation

tejasnagendra
Collaborator

Created a new class called MultiBlock, which wraps multiple Block modules to reduce the number of NCCL communications. The number of blocks to combine can be controlled with the num_blocks_to_combine parameter for GPT/Llama.
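The diff itself isn't shown here, but a minimal sketch of the wrapping idea might look like the following (this assumes PyTorch-style nn.Module Blocks and a hypothetical group_blocks helper; the actual implementation in the PR may differ):

```python
import torch.nn as nn

class MultiBlock(nn.Module):
    """Wraps several consecutive Blocks into one module so that FSDP
    flattens and shards them as a single unit: one all-gather /
    reduce-scatter per group instead of one per Block."""

    def __init__(self, blocks):
        super().__init__()
        self.blocks = nn.ModuleList(blocks)

    def forward(self, x, *args, **kwargs):
        for block in self.blocks:
            x = block(x, *args, **kwargs)
        return x

def group_blocks(blocks, num_blocks_to_combine):
    # Hypothetical helper: chunk the decoder stack into groups of
    # num_blocks_to_combine and wrap each group in a MultiBlock.
    return nn.ModuleList(
        MultiBlock(blocks[i : i + num_blocks_to_combine])
        for i in range(0, len(blocks), num_blocks_to_combine)
    )
```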

Ideally, the message size should be around 1 GB to get the best performance. However, when models run on multiple nodes, every layer is split into very small chunks, making each message extremely small and resulting in suboptimal use of network bandwidth. This parameter can be tuned to ensure we send larger messages.
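As a rough back-of-envelope illustration of the effect (the block size, world size, and dtype below are hypothetical, not measurements from this PR):

```python
def approx_fsdp_message_bytes(params_per_block, num_blocks_to_combine,
                              world_size, bytes_per_param=2):
    """Approximate per-rank shard size of one FSDP all-gather when
    num_blocks_to_combine Blocks are flattened into a single unit."""
    return params_per_block * num_blocks_to_combine * bytes_per_param / world_size

# Hypothetical example: ~200M-parameter blocks in bf16, sharded across 64 GPUs.
approx_fsdp_message_bytes(200e6, 1, 64)  # ~6 MB per message
approx_fsdp_message_bytes(200e6, 8, 64)  # ~50 MB per message
```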


google-cla bot commented Jan 30, 2024

Thanks for your pull request! It looks like this may be your first contribution to a Google open source project. Before we can look at your pull request, you'll need to sign a Contributor License Agreement (CLA).

View this failed invocation of the CLA check for more information.

For the most up-to-date status, view the checks section at the bottom of the pull request.
