# Pembrolizumab-scFv Optimization Variants Iter1 x PD-1 (YM_0985)

## Overview

YM_0985 includes AlphaBind designs against PD-1. We explored several model hypotheses: (i) does pre-training aid predictivity, and (ii) does the featurization of the input sequences matter. To test pretraining, the term **warm** in `mata_descriptions` denotes models that include pretraining, while **cold** denotes models started from a randomly initialized seed. For featurization, we explored **label-encoded** sequences, using a one-hot encoding of amino acid identities, versus an **ESM-featurized** embedding to represent each sequence in the PPI.

## Experimental details

We studied the efficacy of generating binders with different model hyperparameters. This dataset includes 34890 unique VHHs and 1 unique RBD sequence. A more extensive methods section can be found in our publication [here](https://pmc.ncbi.nlm.nih.gov/articles/PMC12296056/).

## Misc dataset details

We define the following binders:

### A-library (scFvs)

There are several terms you can filter by (a filtering sketch is given at the end of this page):

- `Pembro144_WT_`: WT replicates.
- `Pembro144_label_encoded_cold`: label-encoded sequences with no pretraining.
- `Pembro144_label_encoded_warm`: label-encoded sequences with pretraining.
- `Pembro144_esm_cold`: ESM-featurized sequences with no pretraining.
- `Pembro144_esm_warm`: ESM-featurized sequences with pretraining.

To get the mutations of interest relative to the parent, we recommend an alignment to the WT sequence (see the alignment sketch at the end of this page).

### Alpha-library

There is only 1 sequence, which is the native target.
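As a rough illustration of the two featurizations described in the Overview, the sketch below label-encodes (and one-hot encodes) a toy amino acid sequence and, if the `fair-esm` package is installed, computes per-residue ESM embeddings for the same sequence. This is a minimal sketch, not the pipeline used to build YM_0985; the ESM checkpoint (`esm2_t6_8M_UR50D`) and the function names are illustrative assumptions.

```python
# Sketch only: not the exact featurization code used to build YM_0985.
import numpy as np

AA_ALPHABET = "ACDEFGHIKLMNPQRSTVWY"  # assumes the 20 standard amino acids
AA_TO_IDX = {aa: i for i, aa in enumerate(AA_ALPHABET)}

def label_encode(seq: str) -> np.ndarray:
    """Map each residue to an integer index (label encoding)."""
    return np.array([AA_TO_IDX[aa] for aa in seq], dtype=np.int64)

def one_hot_encode(seq: str) -> np.ndarray:
    """One-hot matrix of shape (len(seq), 20) over amino acid identities."""
    onehot = np.zeros((len(seq), len(AA_ALPHABET)), dtype=np.float32)
    onehot[np.arange(len(seq)), label_encode(seq)] = 1.0
    return onehot

def esm_embed(seq: str) -> np.ndarray:
    """Per-residue ESM embeddings; requires the `fair-esm` package."""
    import esm
    import torch
    # Small ESM-2 checkpoint chosen purely for illustration.
    model, alphabet = esm.pretrained.esm2_t6_8M_UR50D()
    model.eval()
    batch_converter = alphabet.get_batch_converter()
    _, _, tokens = batch_converter([("seq", seq)])
    with torch.no_grad():
        out = model(tokens, repr_layers=[model.num_layers])
    reps = out["representations"][model.num_layers]
    return reps[0, 1 : len(seq) + 1].numpy()  # drop BOS/EOS token positions
```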
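A minimal sketch of splitting the A-library by the `Pembro144_*` description prefixes listed above, assuming the data has been loaded into a pandas DataFrame; the file path and the `description` column name are placeholders and may not match the released files.

```python
# Sketch only: file path and column name are assumptions about the released files.
import pandas as pd

df = pd.read_csv("YM_0985_a_library.csv")  # placeholder path

groups = {
    "wt": df[df["description"].str.startswith("Pembro144_WT_")],
    "label_encoded_cold": df[df["description"].str.startswith("Pembro144_label_encoded_cold")],
    "label_encoded_warm": df[df["description"].str.startswith("Pembro144_label_encoded_warm")],
    "esm_cold": df[df["description"].str.startswith("Pembro144_esm_cold")],
    "esm_warm": df[df["description"].str.startswith("Pembro144_esm_warm")],
}

for name, subset in groups.items():
    print(f"{name}: {len(subset)} sequences")
```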
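For calling mutations relative to the parent, one option is a global pairwise alignment against the WT scFv, e.g. with Biopython's `PairwiseAligner`. The sequences below are toy placeholders; use the parent sequence from the `Pembro144_WT_` entries. The gap-scoring values are illustrative assumptions, and this simple caller reports substitutions only (indels are skipped).

```python
# Sketch only: toy sequences; use the Pembro144_WT_ parent as the reference.
from Bio import Align

WT_SEQ = "QVQLVQSGVE"   # placeholder parent fragment
VARIANT = "QVQLVESGVD"  # placeholder designed variant

aligner = Align.PairwiseAligner()
aligner.mode = "global"
aligner.match_score = 2
aligner.mismatch_score = -1
aligner.open_gap_score = -10      # illustrative gap penalties
aligner.extend_gap_score = -0.5

def substitutions(wt: str, var: str) -> list[str]:
    """Return substitutions such as 'Q6E' (1-indexed on the WT); indels are skipped."""
    alignment = aligner.align(wt, var)[0]
    muts = []
    # alignment.aligned pairs up gap-free blocks of WT and variant coordinates.
    for (wt_start, wt_end), (v_start, v_end) in zip(*alignment.aligned):
        for offset in range(wt_end - wt_start):
            w, v = wt[wt_start + offset], var[v_start + offset]
            if w != v:
                muts.append(f"{w}{wt_start + offset + 1}{v}")
    return muts

print(substitutions(WT_SEQ, VARIANT))  # ['Q6E', 'E10D'] for the toy pair above
```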