Dedicated to building industrial foundation models for universal data intelligence across industries.
CodeBERT
Grounded Language-Image Pre-training
An efficient implementation of the popular sequence models for text generation, summarization, an...
Automatic Generation of Visualizations and Infographics using Large Language Models
JARVIS, a system to connect LLMs with the ML community. Paper: https://arxiv.org/pdf/2303.17580.pdf
MASS: Masked Sequence to Sequence Pre-training for Language Generation
Large Language-and-Vision Assistant for Biomedicine, built towards multimodal GPT-4 level capabil...
This repository contains resources for accessing the official benchmarks, codes, and checkpoints ...
Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
ProbTS is a benchmarking toolkit for time series forecasting.
Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
This is the implementation of the paper AdaMix: Mixture-of-Adaptations for Parameter-efficient Mo...
Large-scale pretraining for dialogue
A Multi-Task Dataset for Simulated Humanoid Control
The implementation of DeBERTa