Stock closing-price prediction using self-attention. The project explores forecasting as a spatio-temporal task, advancing beyond the traditional view of stocks as standalone entities.
The goal is to provide accurate forecasts that aid investors and analysts in making decisions about stocks. As part of this, the correlation between the stocks is determined (see the sketch below).
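A minimal sketch (not the project's actual code) of how inter-stock correlation can be computed with pandas, assuming a hypothetical `closes` DataFrame with one column of closing prices per ticker:

```python
import pandas as pd

# Hypothetical DataFrame: rows indexed by trading day, one column of closing prices per ticker.
closes = pd.DataFrame({
    "AAPL": [150.1, 151.3, 149.8, 152.0],
    "MSFT": [300.2, 302.5, 299.9, 304.1],
    "GOOG": [2750.0, 2761.2, 2740.5, 2770.3],
})

# Correlate daily returns rather than raw prices so shared trends do not dominate.
returns = closes.pct_change().dropna()
corr_matrix = returns.corr()  # Pearson correlation between every pair of stocks
print(corr_matrix)
```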
The Transformer follows this general architecture, using point-wise, fully connected layers in both the encoder and decoder. Self-attention is an attention mechanism that relates different positions of a single sequence in order to compute a representation of that sequence. Each position of the input sequence is projected into three components, Query, Key, and Value, and the relations between positions are computed from them.
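As an illustration of the Query/Key/Value mechanism (a toy NumPy sketch, not the project's implementation), scaled dot-product self-attention can be written as:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # similarity of each query to each key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over key positions
    return weights @ V                               # weighted sum of values

# Toy example: a sequence of 5 positions (e.g. 5 trading days), model dimension 8.
rng = np.random.default_rng(0)
x = rng.normal(size=(5, 8))
# In self-attention, Q, K, and V are linear projections of the same input sequence.
W_q, W_k, W_v = (rng.normal(size=(8, 8)) for _ in range(3))
out = scaled_dot_product_attention(x @ W_q, x @ W_k, x @ W_v)
print(out.shape)  # (5, 8): one attended representation per position
```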
Dataset Link: https://www.kaggle.com/datasets/jacksoncrow/stock-market-dataset/data
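A hedged sketch of loading a few tickers from this dataset, assuming the Kaggle archive's per-symbol CSV layout with `Date` and `Close` columns; the paths and tickers below are illustrative:

```python
import pandas as pd

# Illustrative: the archive ships one CSV per symbol (e.g. under a stocks/ folder).
tickers = ["AAPL", "MSFT", "GOOG"]
frames = {}
for ticker in tickers:
    df = pd.read_csv(f"stocks/{ticker}.csv", parse_dates=["Date"], index_col="Date")
    frames[ticker] = df["Close"]

# Align all tickers on their common trading dates to build one close-price panel.
closes = pd.DataFrame(frames).dropna()
print(closes.head())
```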