
I’m working on a FastAPI project on an Amazon EC2 instance running Ubuntu 20.04.5. The nature of the project requires me to have several custom types (written by me) and third-party types (from Hugging Face transformers, peft, and LangChain) as fields in my model schema. When I try to run my FastAPI application, though, I consistently get this error for each custom/third-party type I use as a field:

RuntimeError: no validator found for <class 'peft.peft_model.PeftModelForCausalLM'>, see `arbitrary_types_allowed` in Config

I’ve done everything I can find online/in docs to fix this error. I’ll show some snippets of my code that should show what I’ve tried so far.

My custom model schema looks like this:

class SaicModelForTextGen(BaseModel):
    """
    Loads a model for Text Generation. Will require some time to load checkpoint shards 
    once called. 
    """

    model_config = ConfigDict(arbitrary_types_allowed=True)

    prompt: str = BASE_PROMPT_TEXTGEN
    
    model, tokenizer = load_text_generation()

    conversation: Conversation = Conversation()

    @validator('model', check_fields=False)
    def validate_model(cls, value):
        return validate(value)

The field `model` is of type PeftModelForCausalLM (shown in the error). The function validate(value) redirects to a simple validation function that immediately returns the object it is called with.
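For context, a minimal sketch of what that pass-through validation function might look like (the actual implementation isn't shown in the question; this is an assumption based on the description above):

```python
def validate(value):
    # Pass-through validation: trust the object and return it unchanged.
    return value
```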

In the API itself, a GET request that uses this class looks like this:

@app.get("/", response_model=None)
def init_text_gen(base_prompt: str = BASE_PROMPT_TEXTGEN) -> SaicModelForTextGen:
    return SaicModelForTextGen(prompt=base_prompt)

Setting `response_model=None` was a workaround I saw suggested in the docs and online, but so far it hasn’t changed anything.

I’ve also experimented with older and newer versions of FastAPI and pydantic, without success. I haven’t been able to find anyone online who has hit the same issue after implementing custom validator functions and setting `response_model=None`.

Does anyone know of a workaround or solution to this issue? Thanks in advance for your help 🙂 – I’ll list the versions/libraries I’m working with below.

I’m working with LLMs on a GPU, so there are a lot of other libraries involved in the project, but I’ve listed the ones that seem directly involved in this issue. Thanks again!

EDIT: Forgot to include the actual libraries. I’m worried this is somehow a dependency issue.

fastapi 0.95.0

uvicorn 0.23.2

pydantic 1.10.12

python 3.11

transformers 4.31.0

peft 0.4.0

2 Answers


  1. Chosen as BEST ANSWER

    I eventually reached a solution to this error. Since I was working with custom types I trusted, I bypassed validation entirely with the BaseModel.construct() method. For others with a similar issue: only do this if you can already trust the data going into your custom model, because skipping validation is dangerous in general (although it was safe in my case, as mentioned).
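    As an illustration of that approach, here is a minimal runnable sketch. FakePeftModel is a hypothetical stand-in for a third-party class such as PeftModelForCausalLM, and the pydantic v1 syntax matches the versions listed in the question:

    ```python
    from pydantic import BaseModel

    class FakePeftModel:
        """Hypothetical stand-in for a third-party type pydantic can't validate."""
        pass

    class SaicModelForTextGen(BaseModel):
        prompt: str = "base prompt"
        model: FakePeftModel = None

        class Config:
            arbitrary_types_allowed = True

    # construct() builds the instance without running any validation --
    # only safe when you already trust the inputs.
    instance = SaicModelForTextGen.construct(prompt="hello", model=FakePeftModel())
    print(instance.prompt)  # prints "hello"
    ```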


  2. The issue likely arises because pydantic tries to validate all field types of your model (is an int really an int, etc.), and you have some custom types that pydantic does not know how to handle/validate out of the box. You can tell it to allow those types anyway and validate them yourself (as you already do with your validator):

    class SaicModelForTextGen(BaseModel):
        model: CustomType = your_model
    
        class Config:
            arbitrary_types_allowed = True
    

    You can find more info on this and other options in the pydantic model config documentation.
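    To make that concrete, here is a self-contained sketch of the pattern (CustomType is a hypothetical stand-in for a third-party class such as PeftModelForCausalLM; pydantic v1 syntax):

    ```python
    from pydantic import BaseModel, validator

    class CustomType:
        """Hypothetical stand-in for a type pydantic has no built-in validator for."""
        def __init__(self, name: str):
            self.name = name

    class SaicModelForTextGen(BaseModel):
        model: CustomType

        class Config:
            arbitrary_types_allowed = True  # accept types pydantic can't validate itself

        @validator("model")
        def validate_model(cls, value):
            # Do your own checking here; a simple isinstance test as an example.
            if not isinstance(value, CustomType):
                raise TypeError("model must be a CustomType")
            return value

    m = SaicModelForTextGen(model=CustomType("demo"))
    print(m.model.name)  # prints "demo"
    ```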
