BUG: No such file or directory: 'mistral_models/Mixtral-8x7b-Instruct-v0.1/params.json' #242

@Lorenz5622

Description

Traceback

(mixtral) root@autodl-container-4c3247ac55-3930647f:~/autodl-tmp/mistral-inference/mistral_models# python eval.py
Traceback (most recent call last):
  File "/root/autodl-tmp/mistral-inference/mistral_models/eval.py", line 16, in <module>
    model = Transformer.from_folder(mistral_models_path)
  File "/root/autodl-tmp/mistral-inference/src/mistral_inference/transformer.py", line 306, in from_folder
    with open(Path(folder) / "params.json", "r") as f:
FileNotFoundError: [Errno 2] No such file or directory: 'mistral_models/Mixtral-8x7b-Instruct-v0.1/params.json'
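The traceback shows that `Transformer.from_folder` opens `params.json` directly from the model folder, so the load fails before any weights are read if that file is absent. A minimal pre-flight check can confirm the folder is complete before calling `from_folder` (a sketch; the required-file list below is an assumption based on the raw checkpoint layout, not taken from mistral-inference itself):

```python
from pathlib import Path

# Files the raw (non-transformers) checkpoint layout is expected to contain.
# This list is an assumption -- adjust it to match your checkpoint.
REQUIRED_FILES = ["params.json", "tokenizer.model"]

def check_model_folder(folder: str) -> list:
    """Return the names of required files missing from `folder`."""
    root = Path(folder)
    return [name for name in REQUIRED_FILES if not (root / name).exists()]

missing = check_model_folder("mistral_models/Mixtral-8x7b-Instruct-v0.1")
if missing:
    print(f"Missing files: {missing} -- re-download the raw checkpoint before loading.")
```

Running this before `Transformer.from_folder` turns the opaque `FileNotFoundError` into an explicit list of what the download is missing.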

Pip Freeze

annotated-types==0.7.0
anyio==4.9.0
attrs==23.2.0
backports.tarfile==1.2.0
build==1.2.2.post1
CacheControl==0.14.2
certifi==2025.1.31
cffi==1.17.1
charset-normalizer==3.4.1
cleo==2.1.0
crashtest==0.4.1
cryptography==44.0.2
distlib==0.3.9
docstring_parser==0.16
dulwich==0.22.8
exceptiongroup==1.2.1
fastjsonschema==2.21.1
filelock==3.14.0
findpython==0.6.3
fire==0.6.0
fsspec==2024.5.0
h11==0.14.0
httpcore==1.0.7
httpx==0.28.1
huggingface-hub==0.29.3
idna==3.10
importlib_metadata==8.6.1
iniconfig==2.0.0
installer==0.7.0
jaraco.classes==3.4.0
jaraco.context==6.0.1
jaraco.functools==4.1.0
jeepney==0.9.0
Jinja2==3.1.4
jsonschema==4.21.1
jsonschema-specifications==2023.12.1
keyring==25.6.0
MarkupSafe==2.1.5
mistral_common==1.5.4
-e git+https://github.com/mistralai/mistral-inference.git@6eb35510403825cfb430b0004443053e8c4b70dc#egg=mistral_inference
more-itertools==10.6.0
mpmath==1.3.0
msgpack==1.1.0
mypy==1.10.0
mypy-extensions==1.0.0
mypy-protobuf==3.6.0
networkx==3.2.1
numpy==1.26.4
nvidia-cublas-cu12==12.1.3.1
nvidia-cuda-cupti-cu12==12.1.105
nvidia-cuda-nvrtc-cu12==12.1.105
nvidia-cuda-runtime-cu12==12.1.105
nvidia-cudnn-cu12==8.9.2.26
nvidia-cufft-cu12==11.0.2.54
nvidia-curand-cu12==10.3.2.106
nvidia-cusolver-cu12==11.4.5.107
nvidia-cusparse-cu12==12.1.0.106
nvidia-cusparselt-cu12==0.6.2
nvidia-nccl-cu12==2.20.5
nvidia-nvjitlink-cu12==12.5.40
nvidia-nvtx-cu12==12.1.105
packaging==24.0
pbs-installer==2025.3.17
pillow==11.1.0
pkginfo==1.12.1.2
platformdirs==4.3.7
pluggy==1.5.0
poetry==2.1.2
poetry-core==2.1.2
protobuf==5.27.0
pycparser==2.22
pydantic==2.9.2
pydantic_core==2.23.4
pyproject_hooks==1.2.0
pytest==7.4.4
PyYAML==6.0.2
RapidFuzz==3.12.2
referencing==0.35.1
regex==2024.11.6
requests==2.32.3
requests-toolbelt==1.0.0
rpds-py==0.18.1
ruff==0.2.2
safetensors==0.4.3
SecretStorage==3.3.3
sentencepiece==0.2.0
shellingham==1.5.4
simple_parsing==0.1.5
six==1.16.0
sniffio==1.3.1
sympy==1.12
termcolor==2.4.0
tiktoken==0.9.0
tokenizers==0.21.1
tomli==2.0.1
tomlkit==0.13.2
torch==2.3.0
tqdm==4.67.1
transformers==4.50.3
triton==2.3.0
trove-classifiers==2025.3.19.19
types-protobuf==4.24.0.20240129
typing-inspection==0.4.0
typing_extensions==4.12.0
urllib3==2.3.0
virtualenv==20.29.3
xformers==0.0.26.post1
zipp==3.21.0
zstandard==0.23.0

Reproduction Steps

  1. Follow the instructions in Installation.
  2. Download Mixtral-8x7b-Instruct-v0.1 from Hugging Face.
  3. Run the command: torchrun --nproc-per-node 2 --no-python mistral-demo $M8x7B_DIR
  4. Output: [rank1]: FileNotFoundError: [Errno 2] No such file or directory: '/root/autodl-tmp/mistral-inference/mistral_models/Mixtral-8x7b-Instruct-v0.1/params.json'
  5. Following the instructions on Hugging Face instead produces the same "no params.json" error.
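One likely explanation (an assumption, not confirmed in this issue): mistral-inference expects the raw checkpoint layout marked by `params.json`, while the Hugging Face repo may ship the transformers layout marked by `config.json`, which mistral-inference cannot load. A small sketch that guesses which layout a downloaded folder uses (the marker file names are assumptions):

```python
from pathlib import Path

def detect_layout(folder: str) -> str:
    """Guess whether a model folder uses the raw mistral-inference
    layout or the transformers layout, based on assumed marker files."""
    root = Path(folder)
    if (root / "params.json").exists():
        return "raw"            # loadable with Transformer.from_folder
    if (root / "config.json").exists():
        return "transformers"   # loadable with the transformers library instead
    return "unknown"
```

If this reports "transformers" for the downloaded folder, the fix is to fetch the raw-format checkpoint rather than to point mistral-inference at the transformers-format files.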

Expected Behavior

Please add a params.json file to mistralai/Mixtral-8x7B-Instruct-v0.1 on Hugging Face so that the model can be loaded with mistral-inference. Thanks.

Additional Context

No response

Suggested Solutions

No response

Labels: bug