
Wasserstein learning method for self-attention temporal point process generation model

Lu Jiaming
Li Chenlong
Wei Yiqiang
College of Mathematics, Taiyuan University of Technology, Jinzhong, Shanxi 030600, China

Abstract

At present, temporal point processes are typically described by modeling the intensity function with a recurrent neural network (RNN). However, such models cannot capture long-range dependence in event sequences, and a specific parametric form of the intensity function limits the model's generalization ability. To address these problems, this paper proposed an intensity-free self-attention generation model for temporal point processes. The model used the Wasserstein distance to construct the objective function, making it convenient to measure the deviation between the model distribution and the real distribution, and used the self-attention mechanism to describe the influence of historical events on the current event, so that the model was interpretable and more robust. Comparative experiments show that, without prior knowledge of the intensity function, the QQ-plot slope deviation and the empirical intensity deviation of this method are reduced by 35.125% and 24.200% respectively compared with the RNN-based generation model and the maximum likelihood model, which demonstrates the effectiveness of the proposed model.
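A minimal illustrative sketch (not the authors' released code) of the idea summarized above, written in PyTorch: a self-attention generator maps a noise sequence to an increasing sequence of event times, and training minimises a Wasserstein-style distance between generated and observed sequences. The names SelfAttentionTPPGenerator and wasserstein_1d are hypothetical, and the closed-form matched-time distance used here is a simplification of the adversarial Wasserstein objective with a critic described in the paper.

# Hypothetical sketch: intensity-free, self-attention event-time generator
# trained with a Wasserstein-style distance between event sequences.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SelfAttentionTPPGenerator(nn.Module):
    """Maps a noise sequence to a monotonically increasing event-time sequence."""

    def __init__(self, d_model: int = 32, n_heads: int = 4):
        super().__init__()
        self.embed = nn.Linear(1, d_model)          # embed each noise value
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.proj = nn.Linear(d_model, 1)           # per-step inter-event gap

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        # z: (batch, seq_len, 1) noise; self-attention lets every step attend
        # to all other steps, which is how long-range dependence is captured.
        h = self.embed(z)
        h, _ = self.attn(h, h, h)
        gaps = F.softplus(self.proj(h))                  # positive gaps
        return torch.cumsum(gaps.squeeze(-1), dim=-1)    # increasing event times


def wasserstein_1d(gen_times: torch.Tensor, real_times: torch.Tensor) -> torch.Tensor:
    # For equal-length, sorted event sequences on a common window, the
    # 1-Wasserstein distance reduces to the mean absolute difference of
    # matched event times (a simplification of the paper's critic-based loss).
    return (gen_times - real_times).abs().mean()


if __name__ == "__main__":
    torch.manual_seed(0)
    gen = SelfAttentionTPPGenerator()
    opt = torch.optim.Adam(gen.parameters(), lr=1e-3)
    # Toy "real" data: batches of sorted homogeneous-Poisson-like event times.
    real = torch.sort(torch.rand(64, 20) * 10.0, dim=-1).values
    for _ in range(200):
        z = torch.rand(64, 20, 1)
        loss = wasserstein_1d(gen(z), real)
        opt.zero_grad()
        loss.backward()
        opt.step()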

Foundation Support

National Natural Science Foundation of China (61901294)
Applied Basic Research Program of Shanxi Province (201901D211105)

Publication Information

DOI: 10.19734/j.issn.1001-3695.2021.08.0298
Published in: Application Research of Computers (printed article), Vol. 39, No. 2, 2022
Section: Algorithm Research & Explore
Pages: 456-460
Serial Number: 1001-3695(2022)02-022-0456-05

Publication History

[2021-10-22] Accepted
[2022-02-05] Published in print

Cite This Article

芦佳明, 李晨龙, 魏毅强. 自注意力时序点过程生成模型的Wasserstein学习方法 [J]. 计算机应用研究, 2022, 39 (2): 456-460. (Lu Jiaming, Li Chenlong, Wei Yiqiang. Wasserstein learning method for self-attention temporal point process generation model [J]. Application Research of Computers, 2022, 39 (2): 456-460. )

About the Journal

  • Application Research of Computers (Monthly)
  • ISSN 1001-3695
  • CN 51-1196/TP

Application Research of Computers, founded in 1984, is an academic journal of computing technology sponsored by Sichuan Institute of Computer Sciences under the Science and Technology Department of Sichuan Province.

Focusing on the cutting-edge technologies urgently needed in the discipline, Application Research of Computers reflects in a timely manner the mainstream technologies, hot topics, and latest development trends of computer application research at home and abroad. The journal mainly publishes high-level academic papers in the discipline, together with the latest scientific research results and major application results. Its columns cover new theories of computer science, fundamental computer theory, algorithm theory, algorithm design and analysis, blockchain technology, system software and software engineering, pattern recognition and artificial intelligence, computer architecture, advanced computing, parallel processing, database technology, computer networks and communication, information security, computer graphics and image processing, and their latest hot application technologies.

Application Research of Computers has a large base of high-level readers and authors. Its readers are mainly senior and mid-level researchers and engineers in the field of computer science, as well as teachers and students of computer science and related majors in colleges and universities. Over the years, the journal's total citation frequency and web download rate have ranked among the top of comparable academic journals in the discipline, and the papers it publishes are highly regarded by readers for their novelty, academic quality, foresight, guidance, and practicality.


Indexed & Evaluation

  • 100 Key Journals of the Second National Periodical Award
  • Double-Effect Journal of the China Journal Formation
  • Core Journal of China (Peking University, 2023 edition)
  • Core Journal for Science and Technology of China
  • Chinese Science Citation Database (CSCD) Source Journal
  • RCCSE Chinese Core Academic Journal
  • Journal of the China Computer Federation
  • Included in the World Journal Clout Index (WJCI) Report of Scientific and Technological Periodicals, 2020-2022
  • Full-text Source Journal of the China Science and Technology Periodicals Database
  • Source Journal of the China Academic Journals Comprehensive Evaluation Database
  • Source Journal of China Academic Journals (CD-ROM Version) and China Journal Network
  • China Outstanding Academic Journals with International Influence (Natural Science and Engineering Technology), 2017-2019
  • Source Journal of the Top Academic Papers (F5000) Program of China's Excellent Science and Technology Journals
  • Source Journal of the China Engineering Technology Electronic Information Network and the Electronic Technology Literature Database
  • Source Journal of Science Abstracts (INSPEC), UK
  • Source Journal of the Japan Science and Technology Agency (JST)
  • Source Journal of the Abstract Journal (AJ) of VINITI, Russia
  • Full-text Journal of EBSCO, USA
  • Core Journal of Cambridge Scientific Abstracts (Natural Sciences) (CSA (NS))
  • Index Copernicus (IC), Poland
  • Ulrichsweb (USA)