---
title: ProtGPT2_gradioFold
emoji: 🧬
colorFrom: lightblue
colorTo: blue
sdk: gradio
sdk_version: 3.0.4
app_file: app.py
pinned: false
license: mit
---

Let a 735-million-parameter language model dream up new protein sequences and predict their structures with AlphaFold. Note that only a basic AlphaFold pipeline is used: no refinement with Amber and no MSA as input (single-sequence mode).

The code in `app.py` is licensed under the MIT license, the AlphaFold code is licensed under the Apache 2.0 license by DeepMind, and the AlphaFold parameters are available under CC BY 4.0 by DeepMind. ProtGPT2 by Ferruz et al. is licensed under MIT.

Used libraries:

- Hugging Face Transformers
- 3Dmol.js
- Tailwind CSS
- Gradio
- Torch and JAX
- ColabFold
- AlphaFold
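
For illustration, here is a minimal sketch of the sequence-generation step using Hugging Face Transformers. It assumes the public `nferruz/ProtGPT2` checkpoint on the Hub and sampling parameters similar to those suggested in its model card; the actual `app.py` may use different settings.

```python
# Sketch: sample candidate protein sequences with ProtGPT2.
# Checkpoint name and sampling parameters are assumptions, not
# necessarily what app.py uses.
from transformers import pipeline

protgpt2 = pipeline("text-generation", model="nferruz/ProtGPT2")

# "<|endoftext|>" acts as the start token; sequences are sampled
# rather than generated greedily.
outputs = protgpt2(
    "<|endoftext|>",
    max_length=100,
    do_sample=True,
    top_k=950,
    repetition_penalty=1.2,
    num_return_sequences=5,
    eos_token_id=0,
)

for out in outputs:
    # Strip newlines and the special token so the sequence can be
    # passed on to the folding step (ColabFold/AlphaFold,
    # single-sequence mode, no Amber refinement).
    seq = out["generated_text"].replace("\n", "").replace("<|endoftext|>", "")
    print(seq)
```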