Time series analysis has significant practical value in many real-world scenarios, such as forecasting power generation and traffic flow. Recently, time series forecasting models based on representation learning have attracted widespread attention for their ability to retain rich meta-information. However, current learning-based representation models for time series typically employ stacked convolutional neural networks (CNNs) as the primary encoder and construct positive and negative samples at the timestamp level. CNNs are designed mainly to capture local patterns, whereas in time series forecasting long-term dependencies and global patterns are more crucial, and CNNs are less adept at capturing them. To address this issue, we introduce "Contrastive Learning Enhanced by Transformer Block for Time Series Forecasting." The method first employs a Transformer encoder to extract features that capture global patterns, then uses a contrastive loss to encourage the model to recognize similarities among positive samples and differences among negative ones. The learned representations are finally used for forecasting. Experimental results show that our approach outperforms traditional methods on time series forecasting tasks.
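To make the contrastive objective concrete, the following is a minimal sketch of an InfoNCE-style loss over timestamp-level embeddings, the kind of loss commonly used to pull an anchor representation toward its positive sample and push it away from negatives. The function name, the cosine-similarity choice, and the temperature value are illustrative assumptions, not details taken from the paper itself.

```python
import numpy as np

def info_nce_loss(anchor, positive, negatives, temperature=0.1):
    """InfoNCE-style contrastive loss for a single anchor embedding.

    The positive's similarity is placed at index 0 of the logits, so the
    loss is the softmax cross-entropy of classifying the positive
    correctly among all candidates.
    """
    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    sims = [cosine(anchor, positive)] + [cosine(anchor, n) for n in negatives]
    logits = np.array(sims) / temperature
    logits -= logits.max()  # numerical stability before exponentiating
    probs = np.exp(logits) / np.exp(logits).sum()
    return -np.log(probs[0])  # low when anchor is closest to its positive

# Illustrative toy embeddings: the anchor is near its positive and far
# from the negative, so the loss is small; swapping roles raises it.
anchor = np.array([1.0, 0.0])
positive = np.array([0.9, 0.1])
negative = np.array([-1.0, 0.2])
loss_aligned = info_nce_loss(anchor, positive, [negative])
loss_swapped = info_nce_loss(anchor, negative, [positive])
```

In a full pipeline this loss would be applied to embeddings produced by the Transformer encoder, with gradients flowing back through it; the sketch above only shows the loss computation itself.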