
Results for *

Showing results 1 to 2 of 2.

  1. Machine Translation
    18th China Conference, CCMT 2022, Lhasa, China, August 6-10, 2022, revised selected papers
    Contributors: Pino, Juan (editor); Xiao, Tong (editor)
    Published: [2022]; ©2022
    Publisher: Springer, Singapore


    Access:
    Aggregator (license required)
    Staatsbibliothek zu Berlin - Preußischer Kulturbesitz, Haus Unter den Linden
    unrestricted interlibrary loan, copying, and lending

    Contents: Intro -- Preface -- Organization -- Contents
    PEACook: Post-editing Advancement Cookbook -- 1 Introduction -- 2 Related Work -- 2.1 APE Problem and APE Metrics -- 2.2 APE Baselines -- 3 PEACook Corpus -- 3.1 PEACook Corpus Details -- 4 Baseline Model Experiments -- 4.1 Pre-training AR-APE Model -- 4.2 Fine-Tuning AR-APE Model -- 4.3 Pre-training NAR-APE Model -- 4.4 Fine-Tuning NAR-APE Model -- 5 Conclusion -- References
    Hot-Start Transfer Learning Combined with Approximate Distillation for Mongolian-Chinese Neural Machine Translation -- 1 Introduction -- 2 Background -- 2.1 NMT -- 2.2 Transfer Learning -- 2.3 Pre-train Techniques -- 3 Methods -- 3.1 Word Alignment Under Hot-Start -- 3.2 Approximate Distillation -- 4 Experiment -- 4.1 Settings -- 4.2 Results and Analysis -- 4.3 Ablation Test -- 4.4 Case Analysis -- 5 Conclusion -- References
    Review-Based Curriculum Learning for Neural Machine Translation -- 1 Introduction -- 2 Related Work -- 3 Review-Based Curriculum Learning -- 3.1 Time-Based Review Method -- 3.2 Master-Based Review Method -- 3.3 General Domain Enhanced Training -- 4 Experiment -- 4.1 Data and Setup -- 4.2 Main Results -- 5 Analysis -- 5.1 Effect of Mixed Fine Tuning -- 5.2 Low-Resource Scenario -- 5.3 Data Sharding -- 5.4 Training Efficiency -- 6 Conclusion -- References
    Multi-strategy Enhanced Neural Machine Translation for Chinese Minority Languages -- 1 Introduction -- 2 Dataset -- 3 System Overview -- 3.1 Back-Translation -- 3.2 Alternated Training -- 3.3 Ensemble -- 4 Experiments -- 4.1 Mongolian–Chinese -- 4.2 Tibetan–Chinese -- 4.3 Uyghur–Chinese -- 5 Analysis -- 5.1 The Effect of Different Back-Translation Methods -- 5.2 The Impact of Sentence Segmentation on the Translation Quality of Machine Translation -- 5.3 Analysis of BLEU Scores of Mongolian–Chinese Machine Translation on the Development Set.

    Source: Staatsbibliothek zu Berlin
    Language: English
    Media type: Conference proceedings
    Format: Online
    ISBN: 9789811979606
    Corporate bodies/congresses: CCMT, 18th (2022, Lhasa)
    Series: Communications in Computer and Information Science ; 1671
    Subjects: Chinese language - Machine translating; Machine translating - Congresses; COMPUTERS / Data Processing / General; COMPUTERS / Data Processing / Speech & Audio Processing; COMPUTERS / Database Management / General; COMPUTERS / Information Theory; COMPUTERS / Programming / General; Coding theory & cryptology; Computer programming / software development; Computer applications in the social and behavioral sciences; Databases; Information theory; Natural language & machine translation; Society & social sciences; Theoretical computer science
    Extent: 1 online resource (xiv, 160 pages), illustrations, diagrams
    Note(s):

    Description based on publisher supplied metadata and other sources
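
    For import into a reference manager, the record's metadata maps roughly to the BibTeX entry below. The entry key and the exact field mapping are illustrative assumptions, not the catalog's actual export output:

    ```bibtex
    @proceedings{ccmt2022,
      title     = {Machine Translation: 18th China Conference, CCMT 2022,
                   Lhasa, China, August 6--10, 2022, Revised Selected Papers},
      editor    = {Pino, Juan and Xiao, Tong},
      year      = {2022},
      publisher = {Springer},
      address   = {Singapore},
      series    = {Communications in Computer and Information Science},
      volume    = {1671},
      isbn      = {9789811979606},
      language  = {English}
    }
    ```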

  2. Machine Translation
    18th China Conference, CCMT 2022, Lhasa, China, August 6-10, 2022, revised selected papers
    Contributors: Pino, Juan (editor); Xiao, Tong (editor)
    Published: [2022]; ©2022
    Publisher: Springer, Singapore


    Access:
    Aggregator (license required)
    Staatsbibliothek zu Berlin - Preußischer Kulturbesitz, Haus Potsdamer Straße
    no interlibrary loan
    Universitätsbibliothek Clausthal
    no interlibrary loan
    Hochschulbibliothek Friedensau
    Online resource
    no interlibrary loan
    Universität Ulm, Kommunikations- und Informationszentrum, Bibliotheksservices
    no interlibrary loan

     


     
