Parameters not affecting model output when using LMDeploy

#7
by hassanraha - opened

I'm currently experimenting with InternVL-2 using LMDeploy as the inference engine and have run into a few issues with parameter settings during inference.

  1. Parameters like temperature and top_p do not affect the results: despite setting these parameters to a range of values, I see no noticeable differences in the outputs (a minimal sketch of how I'm calling the pipeline is below). Has anyone else run into this? Are there any known solutions or workarounds to ensure these parameters influence generation as expected?

  2. max_input_tiles parameter: I couldn't find max_input_tiles in the LMDeploy documentation. Could someone explain how to use this parameter during inference? Is there a specific syntax or method for passing it?
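
For context, here is a simplified sketch of the kind of call I'm making; the model repo and image URL are placeholders, and the sampling values are just examples:

```python
from lmdeploy import pipeline, GenerationConfig
from lmdeploy.vl import load_image

# Placeholder model path: any InternVL-2 checkpoint loaded via the standard pipeline.
pipe = pipeline('OpenGVLab/InternVL2-8B')

# Placeholder image URL.
image = load_image('https://example.com/sample.jpg')

# Sampling parameters that I expected to change the output.
gen_config = GenerationConfig(
    temperature=0.9,
    top_p=0.8,
    top_k=40,
    max_new_tokens=512,
)

response = pipe(('Describe this image in detail.', image), gen_config=gen_config)
print(response.text)
```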

Any help or insights on these issues would be greatly appreciated. Thanks in advance!

OpenGVLab org
edited Aug 21
  1. For the first question, could you provide specific test code and samples?
  2. For the second question, max_input_tiles (called max_dynamic_patch in LMDeploy) has been added to LMDeploy. See https://github.com/InternLM/lmdeploy/pull/2245 and https://github.com/InternLM/lmdeploy/pull/2292 for details; a sketch of one way to set it is below.
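
A minimal sketch of one possible way to apply this, assuming your LMDeploy build includes the linked PRs: InternVL-2's HuggingFace config.json carries the dynamic-tiling settings, so overriding max_dynamic_patch in a local copy of the weights before creating the pipeline should cap the number of image tiles. The model path below is a placeholder, and the key name follows InternVL's config rather than an LMDeploy-specific option.

```python
import json
from pathlib import Path

from lmdeploy import pipeline

# Placeholder path to a local snapshot of the InternVL-2 weights.
model_path = Path('/path/to/InternVL2-8B')

# InternVL's config.json holds the dynamic-tiling settings; lower
# max_dynamic_patch to reduce the maximum number of image tiles per request.
config_file = model_path / 'config.json'
config = json.loads(config_file.read_text())
config['max_dynamic_patch'] = 6  # assumed key name, matching InternVL's HF config
config_file.write_text(json.dumps(config, indent=2))

pipe = pipeline(str(model_path))
```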
czczup changed discussion status to closed
