Feature/remove timm clip flags #106

Open
jain18ayush wants to merge 11 commits into Prisma-Multimodal:main from jain18ayush:feature/remove-timm-clip-flags

Conversation

@jain18ayush (Contributor) commented Jul 5, 2024

Reconfigured the flags for #98 based on the transformer configs.

@themachinefan (Collaborator) commented Jul 6, 2024

It's messy, but is_clip actually still loads from the Hugging Face transformers library; we don't use the clip library.
Based on the following code, it seems like all you may need to do is check whether the AutoConfig has a 'vision_config' attribute:

elif is_clip:  # Extract vision encoder from dual-encoder CLIP model.
    hf_config = AutoConfig.from_pretrained(model_name).vision_config
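For context, a minimal sketch of what that attribute check might look like (the helper name and structure here are illustrative, not the PR's actual code):

from transformers import AutoConfig

def resolve_vision_config(model_name: str):
    """Return the vision sub-config for dual-encoder CLIP models, else the full config."""
    hf_config = AutoConfig.from_pretrained(model_name)
    if hasattr(hf_config, "vision_config"):
        # Dual-encoder CLIP-style checkpoint (e.g. openai/clip-vit-base-patch32):
        # only the vision tower's config is needed here.
        return hf_config.vision_config
    # Plain vision checkpoint (e.g. a ViT model): use the config directly.
    return hf_config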

@jain18ayush (Contributor, Author) commented

Ok. Thanks for the tip!

@jain18ayush marked this pull request as ready for review on July 10, 2024 at 18:04
@lowlorenz (Collaborator) commented

What is the status of this PR? @soniajoseph is there something preventing us from merging this?

@soniajoseph (Collaborator) commented

Apologies for the delay here and thank you @jain18ayush for the code!

We've largely moved to OpenCLIP models, so it's not clear to me that the list-matching works here. This is something to adapt as soon as Karolis pushes the OpenCLIP adaptation code, which should come with a list of models to check. I'll tag him here as soon as he accepts my invite to this repo.
