ACM BCB - Transfer Learning for Predicting Virus-Host Protein Interactions for Novel Virus Sequences
Generalization refers to how well a machine learning model adapts to new, previously unseen data. We focus on out-of-distribution (OOD) generalization.
Deep learning constructs networks of parameterized functional modules and trains them from reference examples using gradient-based optimization [LeCun19].
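As a minimal sketch of this idea (a hypothetical toy example, not from any of the papers below): a single parameterized linear module trained from reference examples by gradient descent on a mean-squared-error objective.

```python
import numpy as np

# Toy illustration: learn the weights of one linear module from examples
# by following the gradient of a mean-squared-error loss.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))           # reference inputs
true_w = np.array([1.0, -2.0, 0.5])     # weights we hope to recover
y = X @ true_w                          # reference targets

w = np.zeros(3)                         # module parameters, initialized at 0
lr = 0.1                                # learning rate
for _ in range(200):
    grad = 2 * X.T @ (X @ w - y) / len(X)   # gradient of the MSE loss
    w -= lr * grad                          # one gradient-descent step

print(w)                                # close to true_w after training
```

The same loop, with the hand-derived gradient replaced by automatic differentiation, is what scales this recipe to deep networks of many stacked modules.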
Since it is hard to estimate gradients through functions of discrete random variables, we are interested in making deep learning behave well on discrete structured data and structured representations. Developing such techniques is an active research area; we focus on interpretable and scalable approaches.
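To make the difficulty concrete, here is a hypothetical sketch of one standard workaround, the score-function (REINFORCE) estimator: the discrete sample itself is not differentiable, but the gradient of an expectation over it can still be estimated from samples. The objective f and all parameter values below are illustrative assumptions, not taken from the papers listed here.

```python
import numpy as np

# Goal: estimate d/dtheta of E_{z ~ Bernoulli(sigmoid(theta))}[f(z)],
# where z is discrete, so f(z) cannot be backpropagated through directly.
rng = np.random.default_rng(0)

theta = 0.3                        # logit parameterizing the Bernoulli
p = 1.0 / (1.0 + np.exp(-theta))   # sigmoid(theta)

def f(z):
    # Arbitrary downstream objective evaluated on the discrete sample.
    return (z - 0.4) ** 2

z = (rng.random(100_000) < p).astype(float)   # discrete, non-differentiable samples
# For a sigmoid parameterization, d/dtheta log Bernoulli(z; p) = z - p.
score = z - p
grad_est = np.mean(f(z) * score)   # score-function gradient estimate

# Closed form for this toy case, to check the estimate against:
# E[f] = p*f(1) + (1-p)*f(0), so dE/dtheta = (f(1) - f(0)) * p * (1 - p).
grad_true = (f(1.0) - f(0.0)) * p * (1 - p)
print(grad_est, grad_true)
```

The estimator is unbiased but high-variance, which is one reason alternatives such as continuous relaxations (e.g. Gumbel-softmax) are studied for discrete structured representations.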
Have questions or suggestions? Feel free to ask me on Twitter or email me.
Thanks for reading!
Title: General Multi-label Image Classification with Transformers
Title: Curriculum Labeling: Self-paced Pseudo-Labeling for Semi-Supervised Learning
Title: Searching for a Search Method: Benchmarking Search Algorithms for Generating NLP Adversarial Examples
Title: Reevaluating Adversarial Examples in Natural Language
Tool: MUST-CNN: A Multilayer Shift-and-Stitch Deep Convolutional Architecture for Sequence-based Protein Structure Prediction
Title: Deep Learning for Character-based Information Extraction on Chinese and Protein Sequence
Tool: Multitask-ProteinTagging: A unified multitask architecture for predicting local protein properties
Paper0: Learning to rank with (a lot of) word features