
The Role of Length Normalization in Beam Search and Sequence Generation


Length normalization is a technique used in beam search or other sequence generation algorithms to address biases towards shorter or longer sequences. It aims to ensure fair evaluation and ranking of sequences of different lengths, especially when using probability-based scoring methods.

In the context of beam search:

Problem Addressed

  • Length Bias: Without length normalization, longer sequences receive lower probabilities simply because each additional token multiplies in another factor less than 1. As a result, beam search tends to favor shorter hypotheses regardless of their quality.
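A tiny sketch of the bias, using made-up per-token probabilities: extending a sequence by two tokens of the same quality still cuts its raw score.

```python
import math

# Hypothetical per-token probabilities: even a fairly confident model
# assigns each token a probability well below 1.0.
short_seq = [0.6, 0.5, 0.7]            # 3 tokens
long_seq = [0.6, 0.5, 0.7, 0.6, 0.5]   # 5 tokens, same per-token quality

score_short = math.prod(short_seq)     # ≈ 0.21
score_long = math.prod(long_seq)       # ≈ 0.063

# The longer sequence scores lower purely because more factors < 1
# are multiplied together, not because its tokens are worse.
print(score_short, score_long)
```

The per-token quality is identical in both lists, yet the raw product penalizes the longer one by a factor of 0.6 × 0.5.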

How Length Normalization Works

  • Objective: The goal of length normalization is to adjust the scores or probabilities of candidate sequences based on their lengths to prevent bias towards any particular length.

  • Normalization Factor: It involves scaling the scores of sequences by a factor that takes into account their lengths.

  • Length Penalization: In practice, this usually means dividing the log-probability (or other scoring metric) of each candidate by its length, or by its length raised to a tunable exponent, so that the accumulated penalty no longer grows automatically with each extra token.
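One common concrete form of such a penalty, sketched below, is the one popularized by Google's GNMT system (Wu et al., 2016); the constants 5 and 6 and the default alpha are taken from that work, and other choices are possible.

```python
def length_normalized_score(log_prob, length, alpha=0.6):
    """Return the log-probability divided by a length penalty.

    alpha = 0 leaves scores unchanged (no normalization);
    alpha = 1 with penalty = length would be plain division by length.
    The ((5 + length) / 6) ** alpha form follows the GNMT paper,
    where alpha around 0.6-0.7 worked well empirically.
    """
    penalty = ((5 + length) / 6) ** alpha
    return log_prob / penalty
```

For example, `length_normalized_score(-10, 5)` and `length_normalized_score(-15, 7)` bring the two scores onto a comparable per-length scale before ranking.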

Example

  • Suppose you have two sequences: Sequence A has a length of 5 and a log-probability of -10, and Sequence B has a length of 7 and a log-probability of -15.

  • Without length normalization, Sequence A wins the comparison (since -10 > -15), largely because it is shorter: each extra token in Sequence B adds another negative log-probability term.

  • With length normalization, the scores might be adjusted by dividing the log-probabilities by their respective sequence lengths: Sequence A's adjusted score becomes -10/5 = -2, and Sequence B's adjusted score becomes -15/7 ≈ -2.14.

  • After length normalization, Sequence A still scores slightly higher (-2 > -2.14), but the gap shrinks from 5 log-units to about 0.14: the longer sequence is no longer penalized merely for having more tokens, and a slightly better long candidate could now overtake a short one.
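The arithmetic above can be checked in a few lines (the numbers are the same illustrative ones from the example):

```python
log_prob_a, len_a = -10.0, 5   # Sequence A
log_prob_b, len_b = -15.0, 7   # Sequence B

# Raw comparison: A wins simply because it accumulates fewer negative terms.
raw_gap = log_prob_a - log_prob_b      # 5.0 in A's favor

# Per-token (length-normalized) comparison.
norm_a = log_prob_a / len_a            # -2.0
norm_b = log_prob_b / len_b            # ≈ -2.14
norm_gap = norm_a - norm_b             # ≈ 0.14, still in A's favor but far smaller

print(norm_a, norm_b)
```

Normalization does not force the longer sequence to win; it removes the automatic per-token penalty so the two can compete on average quality.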

Purpose and Impact

  • Equal Evaluation: Length normalization aims to ensure fair evaluation and ranking of sequences by considering their lengths, mitigating the bias towards shorter sequences.

  • Balanced Exploration: By normalizing the scores based on length, beam search can explore sequences of varying lengths more evenly, encouraging diversity in generated outputs.

Importance in Sequence Generation

  • Length normalization is particularly crucial in tasks where the length of the output sequence significantly varies or where favoring shorter or longer sequences might lead to biased results.

  • It helps in striking a balance between generating concise, coherent outputs and exploring longer, more contextually rich sequences.

In essence, length normalization in beam search adjusts the scores of candidate sequences based on their lengths to ensure a fair comparison and ranking, promoting a more balanced exploration of sequences of different lengths.
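To make this concrete, here is a minimal, self-contained beam-search sketch over a toy "model" (a made-up next-token distribution, not a real decoder): hypotheses are expanded by raw log-probability, but finished hypotheses are ranked by their length-normalized score.

```python
import math

def beam_search(step_log_probs, beam_width=2, max_len=3, alpha=1.0, eos="</s>"):
    """Toy beam search: hypotheses ending in `eos` are collected and
    finally ranked by length-normalized score, log_prob / len ** alpha."""
    beams = [((), 0.0)]            # (token tuple, cumulative log-prob)
    finished = []
    for _ in range(max_len):
        candidates = []
        for seq, lp in beams:
            for tok, tok_lp in step_log_probs(seq).items():
                hyp = (seq + (tok,), lp + tok_lp)
                (finished if tok == eos else candidates).append(hyp)
        if not candidates:
            break
        # Expansion keeps the top `beam_width` partial hypotheses by raw score.
        beams = sorted(candidates, key=lambda h: h[1], reverse=True)[:beam_width]
    # Final ranking of completed hypotheses uses the normalized score.
    return sorted(finished, key=lambda h: h[1] / len(h[0]) ** alpha, reverse=True)

# Hypothetical toy model: the same next-token distribution at every step.
dist = {"a": math.log(0.5), "b": math.log(0.4), "</s>": math.log(0.1)}
ranked = beam_search(lambda seq: dist)
print(ranked[0][0])  # a longer hypothesis outranks the bare "</s>" once normalized
```

With this toy distribution, the raw score favors stopping immediately (the single-token `("</s>",)` hypothesis), while the length-normalized ranking promotes a longer hypothesis with a better per-token average, illustrating exactly the rebalancing described above.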


Code Labs Academy © 2024 All rights reserved.