Fine-tune and deploy a summarizer model using the Hugging Face Amazon SageMaker containers, bringing your own script
AWS Machine Learning
JULY 29, 2022
BERT is pre-trained by masking random words in a sentence; in contrast, during Pegasus's pre-training, whole sentences are masked from an input document. The model then generates the missing sentences as a single output sequence, using all the unmasked sentences as context, which effectively produces an executive summary of the document.
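Because this gap-sentence objective so closely mirrors abstractive summarization, a pre-trained Pegasus checkpoint can already summarize text before any fine-tuning. The following is a minimal sketch using the Hugging Face transformers library; the checkpoint name "google/pegasus-xsum" and the generation parameters are illustrative assumptions, not part of the original post.

```python
# Minimal sketch: Pegasus-style abstractive summarization with transformers.
# The checkpoint "google/pegasus-xsum" is assumed here for illustration.
from transformers import PegasusForConditionalGeneration, PegasusTokenizer

model_name = "google/pegasus-xsum"  # assumed public checkpoint
tokenizer = PegasusTokenizer.from_pretrained(model_name)
model = PegasusForConditionalGeneration.from_pretrained(model_name)

document = (
    "Pegasus masks whole sentences from an input document during "
    "pre-training and learns to regenerate them as a single output "
    "sequence, which closely matches the downstream summarization task."
)

# Tokenize the source document and generate a summary with beam search.
inputs = tokenizer(document, truncation=True, return_tensors="pt")
summary_ids = model.generate(**inputs, max_length=60, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```

In practice, this same inference code can run inside a SageMaker Hugging Face container, with fine-tuning handled by your own training script.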