
Feat: download from huggingface #13

Open
sammcj opened this issue Jun 3, 2024 · 8 comments
sammcj (Owner) commented Jun 3, 2024

Would tie in nicely with Modelfile templates, but it could just pop open vim with a default template pre-populated with the basics.
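Something like this as the pre-populated starting point, perhaps (just a sketch; the values are placeholder defaults):

```
# hypothetical default template gollama could pre-populate
FROM ./model.Q4_K_M.gguf
PARAMETER temperature 0.7
PARAMETER num_ctx 4096
SYSTEM """You are a helpful assistant."""
```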

remon-nashid commented

Would love to download HF models using gollama, as the ollama registry is a bit restrictive. I see this issue is marked as complete, but I'm not sure how to access that feature. Any clues?

sammcj (Owner, Author) commented Sep 21, 2024

Hey, I just closed it off as I thought I might skip looking into the idea. I'll consider it again, but I need to check the overlap with other tools.

sammcj reopened this Sep 21, 2024
remon-nashid commented Sep 21, 2024

That's great to hear @sammcj, it would be yet another limitation of ollama (access to a wider selection of GGUFs) that gollama tackles. From my point of view, it's even handier than the bi-directional linking with LM Studio, as the latter feels like stepping on LM Studio's toes at times, while with ollama you're just providing a feature that wasn't there. Just my two cents.

Really appreciate this tool and how it facilitates managing local models.

sammcj removed their assignment Nov 26, 2024
sammcj added and removed the help wanted and good first issue labels Nov 26, 2024
YugandharrPatil commented Feb 6, 2025

I completely agree, @remon-nashid. How about we maintain a continuously updated registry that gollama could pull from, containing code for automatic installation and configuration of "popular" models from Hugging Face? I mean, the current state of gollama is okay, just a fancy wrapper around the ollama CLI, but this feature would be sick, especially for people who aren't well versed in Python, PyTorch, and diffusers, which are always a pain to set up.
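For illustration, a rough sketch of what a registry entry plus the download step could look like in Go. All names here (the Entry fields, the repo, the file) are hypothetical placeholders; the only real assumption is Hugging Face's standard `resolve` download URL layout.

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"os"
)

// Entry is a hypothetical registry record describing a "popular" model.
type Entry struct {
	Name string // display name, e.g. "mistral-7b"
	Repo string // Hugging Face repo the GGUF lives in
	File string // GGUF file within that repo
}

// download fetches the GGUF via Hugging Face's resolve URL scheme
// (https://huggingface.co/{repo}/resolve/main/{file}).
func download(e Entry, dest string) error {
	url := fmt.Sprintf("https://huggingface.co/%s/resolve/main/%s", e.Repo, e.File)
	resp, err := http.Get(url)
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	if resp.StatusCode != http.StatusOK {
		return fmt.Errorf("GET %s: %s", url, resp.Status)
	}
	out, err := os.Create(dest)
	if err != nil {
		return err
	}
	defer out.Close()
	// Stream the body straight to disk; GGUFs are far too big to buffer.
	_, err = io.Copy(out, resp.Body)
	return err
}

func main() {
	// Placeholder entry; a real registry would be fetched and parsed.
	e := Entry{
		Name: "mistral-7b",
		Repo: "TheBloke/Mistral-7B-v0.1-GGUF",
		File: "mistral-7b-v0.1.Q4_K_M.gguf",
	}
	if err := download(e, e.File); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
}
```

A real implementation would also want resumable downloads, checksum verification, and quant selection, but the overall shape would be similar.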

sammcj (Owner, Author) commented Feb 6, 2025

> I mean, the current state of gollama is okay, just a fancy wrapper around the ollama CLI

Why don't you contribute to make it more than that then? @YugandharrPatil

YugandharrPatil commented

Would love to, if you're up for the idea. Did you understand what I meant? I'll go through and understand the codebase soon!

YugandharrPatil commented

Oh, and the "just a fancy ollama wrapper" part was constructive criticism; I suggested a new feature in the later part. In case you wrote that to provoke me: I don't really care, man, it's your project.

sammcj (Owner, Author) commented Feb 6, 2025

Not sure what's so constructive about it, but OK.

If you want to have a go at contributing the feature you want, I'd be happy to review it.
