
While sequence-to-sequence (seq2seq) summarization models were initially developed using recurrent neural networks, Transformer encoder-decoder models have recently come into vogue because they are more effective at modeling the dependencies present in the long sequences encountered in summarization. Transformer models combined with self-supervised pre-training (e.g., BERT, GPT-2, RoBERTa, XLNet, ALBERT, T5, ELECTRA) have proven to be a powerful framework for general language learning, achieving very high performance when fine-tuned on a wide range of language tasks.

Until recently, the self-supervised objectives used in the pre-training phase have been somewhat agnostic to the downstream task; researchers have since asked whether better performance could be obtained if the self-supervised objective more closely reflected the final task. PEGASUS (which will appear at the 2020 International Conference on Machine Learning) introduces a self-supervised pre-training objective, called gap-sentence generation, designed for Transformer encoder-decoder models to improve fine-tuning performance on abstractive summarization.
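The post does not cover how a fine-tuned model is actually used, but as a rough illustration of the downstream task, here is a minimal summarization sketch. It assumes the Hugging Face transformers library and the publicly released google/pegasus-xsum checkpoint, neither of which is mentioned in the original post:

```python
# Minimal sketch: summarize a short article with a released PEGASUS checkpoint.
# Assumes: pip install transformers torch sentencepiece
from transformers import PegasusForConditionalGeneration, PegasusTokenizer

model_name = "google/pegasus-xsum"  # assumed publicly available checkpoint
tokenizer = PegasusTokenizer.from_pretrained(model_name)
model = PegasusForConditionalGeneration.from_pretrained(model_name)

article = (
    "Transformer encoder-decoder models pre-trained with task-specific "
    "self-supervised objectives can be fine-tuned to produce abstractive "
    "summaries of long documents."
)

# Tokenize, generate a summary with beam search, and decode it back to text.
batch = tokenizer(article, truncation=True, padding="longest", return_tensors="pt")
summary_ids = model.generate(**batch, max_length=64, num_beams=4)
print(tokenizer.batch_decode(summary_ids, skip_special_tokens=True)[0])
```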



A self-supervised objective for summarization

The PEGASUS developers' hypothesis is that the closer the pre-training self-supervision objective is to the final downstream task, the better the fine-tuning performance will be. In PEGASUS pre-training, several whole sentences are removed from each document and the model is tasked with recovering them: the pre-training input is the document with the sentences missing, and the target output is the missing sentences concatenated together.
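As a concrete illustration of that input/output format, the following is a minimal sketch of building one gap-sentence-generation training pair. The naive sentence splitter, the mask token name [MASK1], and the random choice of which sentences to remove are simplifying assumptions, not details taken from this post:

```python
import random
import re

def make_gsg_example(document, gap_ratio=0.3, mask_token="[MASK1]", seed=0):
    """Build one gap-sentence-generation training pair from a raw document.

    Returns (source, target): the source is the document with some whole
    sentences replaced by a mask token; the target is those removed
    sentences concatenated in their original order.
    """
    # Naive sentence splitting on ., ! or ? followed by whitespace.
    sentences = re.split(r"(?<=[.!?])\s+", document.strip())
    if len(sentences) < 2:
        return document, ""  # nothing to mask in a one-sentence document

    # Choose which sentences to remove. Random choice is a simplification;
    # the selection strategy is not described in this excerpt.
    rng = random.Random(seed)
    n_gaps = max(1, int(len(sentences) * gap_ratio))
    gap_idx = set(rng.sample(range(len(sentences)), n_gaps))

    source = " ".join(
        mask_token if i in gap_idx else s for i, s in enumerate(sentences)
    )
    target = " ".join(s for i, s in enumerate(sentences) if i in gap_idx)
    return source, target

doc = ("PEGASUS is pre-trained with gap-sentence generation. "
       "Whole sentences are removed from the input document. "
       "The model must generate the removed sentences. "
       "This resembles abstractive summarization.")
src, tgt = make_gsg_example(doc)
print("INPUT :", src)
print("TARGET:", tgt)
```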
