
cannot import name 'split_torch_state_dict_into_shards' from 'huggingface_hub'  #35

@cubxx

Description

I just ran:

python anygpt/src/infer/cli_infer_chat_model.py \
--model-name-or-path models/AnyGPT-chat \
--image-tokenizer-path models/seed-tokenizer-2/seed_quantizer.pt \
--speech-tokenizer-path models/speechtokenizer/ckpt.dev \
--speech-tokenizer-config models/speechtokenizer/config.json \
--soundstorm-path models/soundstorm/speechtokenizer_soundstorm_mls.pt \
--output-dir "infer_output/chat"

and got this output:

RuntimeError: Failed to import transformers.models.llama.modeling_llama because of the following error (look up to see its traceback):
Failed to import transformers.generation.utils because of the following error (look up to see its traceback):
cannot import name 'split_torch_state_dict_into_shards' from 'huggingface_hub' (/home/guest/anaconda3/envs/AnyGPT/lib/python3.9/site-packages/huggingface_hub/__init__.py)

I found that the installed huggingface_hub version is 0.17.3, which is why the error is thrown. But when I upgrade it, pip shows:

ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
tokenizers 0.14.1 requires huggingface_hub<0.18,>=0.16.4, but you have huggingface-hub 0.24.0 which is incompatible.

I can't downgrade huggingface_hub because transformers needs split_torch_state_dict_into_shards. So how can I solve this — should I just ignore pip's error?
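A quick way to diagnose this kind of conflict is to check whether the installed package actually exports the symbol the traceback complains about. This is a generic sketch (the package and symbol names are taken from the traceback above; the helper itself is not part of any library):

```python
# Sanity check: is a module importable, and does it export a given symbol?
import importlib

def has_symbol(module_name: str, symbol: str) -> bool:
    """Return True if `module_name` is importable and exports `symbol`."""
    try:
        module = importlib.import_module(module_name)
    except ImportError:
        return False
    return hasattr(module, symbol)

# After upgrading huggingface_hub, this check should succeed:
# has_symbol("huggingface_hub", "split_torch_state_dict_into_shards")
```

As far as I can tell, `split_torch_state_dict_into_shards` only exists in newer huggingface_hub releases, while the `huggingface_hub<0.18` pin comes from the old tokenizers 0.14.1. So rather than ignoring the resolver warning, the usual fix is to upgrade transformers and tokenizers together, which lifts that pin and lets a recent huggingface_hub install cleanly.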
