Upgrade jax to 0.4 in batch notebook too
milot-mirdita committed Dec 15, 2023
1 parent 110e2b6 commit ebfa87e
Showing 1 changed file with 5 additions and 3 deletions.
8 changes: 5 additions & 3 deletions batch/AlphaFold2_batch.ipynb
@@ -127,9 +127,11 @@
"if [ ! -f COLABFOLD_READY ]; then\n",
" # install dependencies\n",
" # We have to use \"--no-warn-conflicts\" because colab already has a lot preinstalled with requirements different to ours\n",
" pip install -q --no-warn-conflicts \"colabfold[alphafold-minus-jax] @ git+https://github.com/sokrypton/ColabFold\" \"tensorflow-cpu==2.11.0\"\n",
" pip uninstall -yq jax jaxlib\n",
" pip install -q \"jax[cuda]==0.3.25\" -f https://storage.googleapis.com/jax-releases/jax_cuda_releases.html\n",
" pip install -q --no-warn-conflicts \"colabfold[alphafold-minus-jax] @ git+https://github.com/sokrypton/ColabFold\"\n",
" pip install --upgrade dm-haiku\n",
" ln -s /usr/local/lib/python3.*/dist-packages/colabfold colabfold\n",
" ln -s /usr/local/lib/python3.*/dist-packages/alphafold alphafold\n",
" sed -i 's/weights = jax.nn.softmax(logits)/logits=jnp.clip(logits,-1e8,1e8);weights=jax.nn.softmax(logits)/g' alphafold/model/modules.py\n",
" touch COLABFOLD_READY\n",
"fi\n",
"\n",
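
The added sed line patches the installed alphafold/model/modules.py (reached through the symlinks created just above) so that attention logits are clipped to [-1e8, 1e8] before the softmax, presumably to keep extreme or non-finite logit values from turning the attention weights into NaN under the newer jax. A small illustrative sketch, not taken from AlphaFold itself, of what the inserted clip guards against:

    # Illustrative only: reproduces the effect of the clip that the sed command inserts.
    import jax
    import jax.numpy as jnp

    logits = jnp.array([jnp.inf, 0.0, -jnp.inf])  # non-finite logits, e.g. from masking

    print(jax.nn.softmax(logits))                 # inf - inf inside softmax -> [nan nan nan]

    clipped = jnp.clip(logits, -1e8, 1e8)         # the patched line: logits=jnp.clip(logits,-1e8,1e8)
    print(jax.nn.softmax(clipped))                # finite again -> [1. 0. 0.]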
