CodeBERT
MASS: Masked Sequence to Sequence Pre-training for Language Generation
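MASS pre-trains an encoder-decoder by masking one contiguous span of the input and training the decoder to reconstruct exactly that span. A minimal sketch of the masking step, assuming a simple token list and a hypothetical `mass_mask` helper (not the repository's actual API):

```python
import random

MASK = "[MASK]"

def mass_mask(tokens, span_frac=0.5, seed=0):
    """MASS-style masking sketch: replace one contiguous span with [MASK]
    tokens for the encoder; the decoder's target is the masked span itself."""
    rng = random.Random(seed)
    k = max(1, int(len(tokens) * span_frac))        # span length
    start = rng.randrange(len(tokens) - k + 1)      # span start position
    enc_input = tokens[:start] + [MASK] * k + tokens[start + k:]
    dec_target = tokens[start:start + k]
    return enc_input, dec_target

enc, tgt = mass_mask("the quick brown fox jumps over".split())
```

Masking a single contiguous span (rather than scattered tokens, as in BERT) forces the decoder to generate fluent multi-token continuations, which is what makes the objective suit generation tasks.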
Foundation Architecture for (M)LLMs
MMdnn is a set of tools that helps users interoperate among different deep learning frameworks. E.g...
General technology for enabling AI capabilities with LLMs and MLLMs
The implementation of DeBERTa
MLOps examples
[CVPR 2023] Official Implementation of X-Decoder for generalized decoding for pixel, image and la...
Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
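The core idea behind LoRA is that a frozen pretrained weight W is adapted by a trainable low-rank update B·A, so the effective weight becomes W + (α/r)·B·A. A minimal numpy sketch of that idea (an illustration, not loralib's actual API; names `lora_forward`, `alpha`, `r` are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
d_out, d_in, r, alpha = 8, 8, 2, 4

W = rng.normal(size=(d_out, d_in))      # frozen pretrained weight
A = rng.normal(size=(r, d_in)) * 0.01   # trainable, small random init
B = np.zeros((d_out, r))                # trainable, zero init: no change at start

def lora_forward(x):
    # Base (frozen) path plus the scaled low-rank adaptation path.
    return x @ W.T + (alpha / r) * (x @ A.T @ B.T)

x = rng.normal(size=(1, d_in))
# With B initialized to zero, the adapted layer matches the frozen layer,
# so fine-tuning starts from the pretrained model's behavior.
assert np.allclose(lora_forward(x), x @ W.T)
```

Only A and B (2·r·d parameters instead of d²) are trained, which is why LoRA makes fine-tuning large models cheap.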
Large-scale pretraining for dialogue
Utilities used by the Deep Program Understanding team
Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
TensorFlow 2 library implementing Graph Neural Networks
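The basic operation a GNN library builds on is message passing: each node aggregates features from its in-neighbors. A minimal numpy sketch of one mean-aggregation step (an illustration of the concept, not the library's API; `message_pass` is a hypothetical name):

```python
import numpy as np

def message_pass(H, edges):
    """One message-passing step with mean aggregation.
    H: (n, d) node features; edges: directed (src, dst) pairs."""
    n, d = H.shape
    agg = np.zeros((n, d))
    deg = np.zeros(n)
    for src, dst in edges:
        agg[dst] += H[src]       # each node sums its in-neighbors' features
        deg[dst] += 1
    deg = np.maximum(deg, 1)     # avoid division by zero for isolated nodes
    return agg / deg[:, None]    # mean of incoming messages

H = np.eye(3)                    # one-hot features for 3 nodes
edges = [(0, 2), (1, 2)]         # nodes 0 and 1 send messages to node 2
out = message_pass(H, edges)
# node 2's new feature is the mean of nodes 0 and 1
```

Real GNN layers add a learned transformation and nonlinearity around this aggregation, and stack several such steps.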
This repository contains resources for accessing the official benchmarks, codes, and checkpoints ...
Samples and Tools for Windows ML.