Investors allocate capital by buying stocks and expect to earn returns from the stock market. Drawing up an investment plan requires collecting and analyzing heterogeneous information, such as historical stock market trading data and news about listed companies; this work is cumbersome and time-consuming, and purely subjective analysis rarely covers all the relevant factors. At the same time, posts on Internet social media, such as stock forums, influence investors' judgment and behavior, and investor sentiment exerts a positive or negative effect on the stock market, which in turn affects price trends. Therefore, this article first builds a stock market prediction model by applying data preprocessing techniques to historical trading data, and second designs an image description generation model based on a generative adversarial network (GAN). The model consists of a generator and a discriminator. A time-varying pre-attention mechanism is proposed in the generator: it allows each image feature to attend to the image features of other stock markets when predicting trends, so that the decoder can better capture the relational information in the image. The discriminator is based on a recurrent neural network and measures how well the input sentence matches both the four reference sentences and the image features. Experiments show that the accuracy of the model is higher than that of a stock price trend forecast model based on historical data alone, demonstrating the effectiveness of the data used in this paper for stock price trend prediction.
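The abstract gives no implementation details of the generator's pre-attention step, so the following is only a minimal NumPy sketch of one plausible reading: each image feature vector attends to the *other* feature vectors (self-attention with the diagonal masked out), producing relation-aware features for the decoder. The function name, shapes, and masking choice are all assumptions, not the paper's actual method.

```python
import numpy as np

def pre_attention(features, mask_self=True):
    """Hypothetical sketch of a pre-attention step: each row of
    `features` attends to the other rows, so the output mixes in
    relational information across features. Details are assumed."""
    n, d = features.shape
    scores = features @ features.T / np.sqrt(d)  # pairwise similarity scores
    if mask_self:
        np.fill_diagonal(scores, -np.inf)        # attend only to *other* features
    # numerically stable row-wise softmax
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ features                    # relation-aware feature vectors

# usage: 4 image-region features, 8 dimensions each
rng = np.random.default_rng(0)
feats = rng.normal(size=(4, 8))
out = pre_attention(feats)
print(out.shape)  # (4, 8): same shape, each row now a mix of the other rows
```

A learned variant would add query/key/value projection matrices and a time-varying component; this sketch keeps only the attention-and-mix core to make the idea concrete.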
Xiao, D. (2021). Research on the Application of Generative Adversarial Networks in the Generation of Stock Market Forecast Trend Images. Scientific Programming, 2021. https://doi.org/10.1155/2021/7321671