
Only one gpu is used when deploying interactive demo #63

Open

fanghgit opened this issue Dec 20, 2024 · 3 comments

@fanghgit

Hi, VITA-1.5 looks great! When I run `python -m web_demo.server --model_path demo_VITA_ckpt --ip 0.0.0.0 --port 8081`, I expect 2 GPUs to be used (loading 2 models), but it appears that only one is being used. Can you help me with that?

Besides, there is another minor issue I ran into, which I more or less fixed myself:

  • web_ability_demo.py and server.py both have the line `config_path = os.path.join(model_path, 'origin_config.json')`, but there is no 'origin_config.json' in the checkpoint directory. I guess one just needs to rename the original config to origin_config.json and copy in the vLLM config (see the sketch below).
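
For anyone hitting the same thing, here is a minimal sketch of the workaround I used, assuming the checkpoint directory is `demo_VITA_ckpt`, the original config is `config.json`, and the vLLM config lives at `vllm_config.json` (those file names are assumptions from my setup):

```python
import os
import shutil

ckpt_dir = "demo_VITA_ckpt"  # assumed checkpoint directory

# Keep the original config under the name the demo scripts look for ...
shutil.copy(os.path.join(ckpt_dir, "config.json"),
            os.path.join(ckpt_dir, "origin_config.json"))

# ... then drop the vLLM config in as config.json (source path assumed).
shutil.copy("vllm_config.json", os.path.join(ckpt_dir, "config.json"))
```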
@longzw1997
Collaborator

Thank you for your attention. We have updated the code; please download the latest version and try it.

@ryansunyz

I have the same question; it appears that only one GPU is being used. I have also updated the code. Have you solved the issue?

@fanghgit
Author

fanghgit commented Jan 10, 2025

@ryansunyz We need to set the environment variables before importing vLLM/PyTorch. This solution should help.
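
For reference, a minimal sketch of what I mean, assuming the demo selects GPUs via `CUDA_VISIBLE_DEVICES` and that devices 0 and 1 are the ones you want (both assumptions):

```python
import os

# Set the GPU selection *before* torch/vLLM are imported; once CUDA has been
# initialized by either library, changing CUDA_VISIBLE_DEVICES has no effect.
os.environ["CUDA_VISIBLE_DEVICES"] = "0,1"  # assumed: two GPUs, one per model

import torch  # noqa: E402  (imports deliberately placed after the env var)
import vllm   # noqa: E402

print(torch.cuda.device_count())  # should report 2 if both GPUs are visible
```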
