Self-supervised Contrastive Representation Learning for Semi-supervised Time-Series Classification

Abstract

This work proposes CA-TCC, a novel semi-supervised representation learning framework for time series that learns representations from only a few labeled samples using contrastive learning. CA-TCC builds on our earlier TS-TCC work by leveraging the robust pseudo labels produced by the fine-tuned TS-TCC model in a supervised contrastive loss, as sketched below.
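
The snippet below is a minimal sketch, not the authors' implementation, of how pseudo labels from a fine-tuned encoder could be plugged into a supervised contrastive loss in the style of SupCon (Khosla et al., 2020). The function name, the `features`, `pseudo_labels`, and `temperature` arguments are illustrative assumptions.

```python
# Hypothetical sketch: supervised contrastive loss driven by pseudo labels.
# Assumes `features` is an (N, D) batch of embeddings from the fine-tuned
# encoder and `pseudo_labels` is an (N,) tensor of its predicted class labels.
import torch
import torch.nn.functional as F


def supervised_contrastive_loss(features, pseudo_labels, temperature=0.07):
    device = features.device
    n = features.size(0)

    # Work with L2-normalized embeddings so dot products are cosine similarities.
    features = F.normalize(features, dim=1)
    logits = features @ features.T / temperature
    # Subtract the per-row max for numerical stability before exponentiating.
    logits = logits - logits.max(dim=1, keepdim=True).values.detach()

    # Positives: samples sharing the same pseudo label, excluding self-pairs.
    labels = pseudo_labels.view(-1, 1)
    positive_mask = (labels == labels.T).float().to(device)
    self_mask = torch.eye(n, device=device)
    positive_mask = positive_mask * (1.0 - self_mask)

    # Log-probability of each pair; self-similarity is excluded from the denominator.
    exp_logits = torch.exp(logits) * (1.0 - self_mask)
    log_prob = logits - torch.log(exp_logits.sum(dim=1, keepdim=True) + 1e-12)

    # Average log-likelihood over positives, for anchors that have at least one positive.
    pos_counts = positive_mask.sum(dim=1)
    valid = pos_counts > 0
    mean_log_prob_pos = (positive_mask * log_prob).sum(dim=1)[valid] / pos_counts[valid]

    return -mean_log_prob_pos.mean()


# Illustrative usage with random data (for shape checking only):
# loss = supervised_contrastive_loss(torch.randn(32, 128), torch.randint(0, 5, (32,)))
```

Using pseudo labels this way lets unlabeled samples with the same predicted class act as positives for one another, which is the key difference from a purely instance-wise contrastive objective.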

Publication
IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI)