Large-scale pretraining for dialogue
JARVIS, a system to connect LLMs with the ML community. Paper: https://arxiv.org/pdf/2303.17580.pdf
This repository contains resources for accessing the official benchmarks, code, and checkpoints ...
CodeBERT
Dedicated to building industrial foundation models for universal data intelligence across industr...
TensorFlow 2 library implementing Graph Neural Networks
Foundation Architecture for (M)LLMs
General technology for enabling AI capabilities with LLMs and MLLMs
MMdnn is a set of tools to help users inter-operate among different deep learning frameworks. E.g...
Community for applying LLMs to robotics and a robot simulator with ChatGPT integration
Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
MASS: Masked Sequence to Sequence Pre-training for Language Generation
Unified-Modal Speech-Text Pre-Training for Spoken Language Processing
[CVPR 2023] Official Implementation of X-Decoder for generalized decoding for pixel, image and la...