
Displaying results 1 to 3 of 3.

  1. Machine Translation
    18th China Conference, CCMT 2022, Lhasa, China, August 6-10, 2022, revised selected papers
    Contributor: Pino, Juan (editor); Xiao, Tong (editor)
    Published: [2022]; ©2022
    Publisher:  Springer, Singapore


    Access:
    Aggregator (license required)
    Staatsbibliothek zu Berlin - Preußischer Kulturbesitz, Haus Potsdamer Straße
    No inter-library loan
    Universitätsbibliothek Clausthal
    No inter-library loan
    Hochschulbibliothek Friedensau
    Online resource
    No inter-library loan
    Hochschule Schmalkalden, Cellarius Bibliothek
    No inter-library loan
    Universität Ulm, Kommunikations- und Informationszentrum, Bibliotheksservices
    No inter-library loan

    Intro -- Preface -- Organization -- Contents -- PEACook: Post-editing Advancement Cookbook -- 1 Introduction -- 2 Related Work -- 2.1 APE Problem and APE Metrics -- 2.2 APE Baselines -- 3 PEACook Corpus -- 3.1 PEACook Corpus Details -- 4 Baseline Model Experiments -- 4.1 Pre-training AR-APE Model -- 4.2 Fine-Tuning AR-APE Model -- 4.3 Pre-training NAR-APE Model -- 4.4 Fine-Tuning NAR-APE Model -- 5 Conclusion -- References -- Hot-Start Transfer Learning Combined with Approximate Distillation for Mongolian-Chinese Neural Machine Translation -- 1 Introduction -- 2 Background -- 2.1 NMT -- 2.2 Transfer Learning -- 2.3 Pre-train Techniques -- 3 Methods -- 3.1 Word Alignment Under Hot-Start -- 3.2 Approximate Distillation -- 4 Experiment -- 4.1 Settings -- 4.2 Results and Analysis -- 4.3 Ablation Test -- 4.4 Case Analysis -- 5 Conclusion -- References -- Review-Based Curriculum Learning for Neural Machine Translation -- 1 Introduction -- 2 Related Work -- 3 Review-Based Curriculum Learning -- 3.1 Time-Based Review Method -- 3.2 Master-Based Review Method -- 3.3 General Domain Enhanced Training -- 4 Experiment -- 4.1 Data and Setup -- 4.2 Main Results -- 5 Analysis -- 5.1 Effect of Mixed Fine Tuning -- 5.2 Low-Resource Scenario -- 5.3 Data Sharding -- 5.4 Training Efficiency -- 6 Conclusion -- References -- Multi-strategy Enhanced Neural Machine Translation for Chinese Minority Languages -- 1 Introduction -- 2 Dataset -- 3 System Overview -- 3.1 Back-Translation -- 3.2 Alternated Training -- 3.3 Ensemble -- 4 Experiments -- 4.1 Mongolian-Chinese -- 4.2 Tibetan-Chinese -- 4.3 Uyghur-Chinese -- 5 Analysis -- 5.1 The Effect of Different Back-Translation Methods -- 5.2 The Impact of Sentence Segmentation on the Translation Quality of Machine Translation -- 5.3 Analysis of BLEU Scores of Mongolian-Chinese Machine Translation on the Development Set.

     

  2. Machine Translation
    18th China Conference, CCMT 2022, Lhasa, China, August 6-10, 2022, revised selected papers
    Contributor: Pino, Juan (editor); Xiao, Tong (editor)
    Published: [2022]; ©2022
    Publisher:  Springer, Singapore


    Access:
    Aggregator (license required)
    Staatsbibliothek zu Berlin - Preußischer Kulturbesitz, Haus Unter den Linden
    Unlimited inter-library loan, copies and loan


  3. Machine Translation
    18th China Conference, CCMT 2022, Lhasa, China, August 6-10, 2022, revised selected papers
    Contributor: Pino, Juan (editor); Xiao, Tong (editor)
    Published: [2022]
    Publisher:  Springer, Singapore


    Technische Informationsbibliothek (TIB) / Leibniz-Informationszentrum Technik und Naturwissenschaften und Universitätsbibliothek
    RS 7445(1671)
    No loan of volumes, only paper copies will be sent


    This book constitutes the refereed proceedings of the 18th China Conference on Machine Translation, CCMT 2022, held in Lhasa, China, during August 6-10, 2022. The 16 full papers included in this book were carefully reviewed and selected from 73 submissions.


    Content information
    Source: Union catalogues
    Contributor: Pino, Juan (editor); Xiao, Tong (editor)
    Language: English
    Media type: Conference proceedings
    Format: Print
    ISBN: 9789811979590
    Corporations / Congresses: CCMT, 18. (2022, Lhasa)
    Series: Communications in Computer and Information Science ; 1671
    Subjects: COMPUTERS / Data Processing / General; COMPUTERS / Data Processing / Speech & Audio Processing; COMPUTERS / Database Management / General; COMPUTERS / Information Theory; COMPUTERS / Programming / General; Coding theory & cryptology; Computer programming / software development; Computer applications in the social and behavioural sciences; Databases; Information theory; Coding theory and encryption (cryptology); Natural language & machine translation; Society & social sciences; Theoretical computer science
    Scope: xiv, 160 pages, illustrations, diagrams
    Notes:

    Interest level: 06, Professional and scholarly: for an expert adult audience, including academic research.

    PEACook: Post-Editing Advancement Cookbook.- Hot-start Transfer Learning combined with Approximate Distillation for Mongolian-Chinese Neural Machine Translation.- Review-based Curriculum Learning for Neural Machine Translation.- Multi-Strategy Enhanced Neural Machine Translation for Chinese Minority Language.- Target-side Language Model for Reference-free Machine Translation Evaluation.- Life Is Short, Train It Less: Neural Machine Tibetan-Chinese Translation Based on mRASP and Dataset Enhancement.- Improving the Robustness of Low-Resource Neural Machine Translation with Adversarial Examples.- Dynamic Mask Curriculum Learning for Non-Autoregressive Neural Machine Translation.- Dynamic Fusion Nearest Neighbor Machine Translation via Dempster-Shafer Theory.- A Multi-tasking and Multi-stage Chinese Minority Pre-Trained Language Model.- An Improved Multi-task Approach to Pre-trained Model Based MT Quality Estimation.- Optimizing Deep Transformers for Chinese-Thai Low-Resource Translation.- HW-TSC Submission for CCMT 2022 Translation Quality Estimation Task.- Effective Data Augmentation Methods for CCMT 2022.- NJUNLP's Submission for CCMT 2022 Quality Estimation Task.- ISTIC's Thai-to-Chinese Neural Machine Translation System for CCMT 2022.
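
    The record offers an export to reference management software in BibTeX format. A minimal sketch of what such an entry might contain, assembled from the fields listed above; the citation key and field layout are illustrative assumptions, not the catalogue's actual export:

    ```bibtex
    @proceedings{ccmt2022,
      title     = {Machine Translation: 18th China Conference, CCMT 2022,
                   Lhasa, China, August 6--10, 2022, Revised Selected Papers},
      editor    = {Pino, Juan and Xiao, Tong},
      year      = {2022},
      publisher = {Springer},
      address   = {Singapore},
      series    = {Communications in Computer and Information Science},
      volume    = {1671},
      isbn      = {9789811979590}
    }
    ```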