A compression algorithm for the combination of PDF sets.

Carrazza S, Latorre JI, Rojo J, Watt G - Eur Phys J C Part Fields (2015)

Bottom Line: We illustrate our strategy with the combination and compression of the recent NNPDF3.0, CT14 and MMHT14 NNLO PDF sets. The resulting compressed Monte Carlo PDF sets are validated at the level of parton luminosities and LHC inclusive cross sections and differential distributions. We determine that around 100 replicas provide an adequate representation of the probability distribution for the original combined PDF set, suitable for general applications to LHC phenomenology.

View Article: PubMed Central - PubMed

Affiliation: Dipartimento di Fisica, Università di Milano and INFN, Sezione di Milano, Via Celoria 16, 20133 Milan, Italy.

ABSTRACT

The current PDF4LHC recommendation to estimate uncertainties due to parton distribution functions (PDFs) in theoretical predictions for LHC processes involves the combination of separate predictions computed using PDF sets from different groups, each of which comprises a relatively large number of either Hessian eigenvectors or Monte Carlo (MC) replicas. While many fixed-order and parton shower programs allow the evaluation of PDF uncertainties for a single PDF set at no additional CPU cost, this feature is not universal, and, moreover, the a posteriori combination of the predictions using at least three different PDF sets is still required. In this work, we present a strategy for the statistical combination of individual PDF sets, based on the MC representation of Hessian sets, followed by a compression algorithm for the reduction of the number of MC replicas. We illustrate our strategy with the combination and compression of the recent NNPDF3.0, CT14 and MMHT14 NNLO PDF sets. The resulting compressed Monte Carlo PDF sets are validated at the level of parton luminosities and LHC inclusive cross sections and differential distributions. We determine that around 100 replicas provide an adequate representation of the probability distribution for the original combined PDF set, suitable for general applications to LHC phenomenology.
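To make the first step of this strategy concrete, the following is a minimal sketch of the standard construction for turning a (symmetric) Hessian set into MC replicas, in which each replica is the central member shifted by a Gaussian-weighted sum of symmetrized eigenvector differences. The function name, array shapes and grids are illustrative assumptions, not the authors' code.

    import numpy as np

    def hessian_to_mc_replicas(f0, f_plus, f_minus, n_rep=100, seed=0):
        # f0      : (n_grid,)        central PDF values on some (x, Q) grid
        # f_plus  : (n_eig, n_grid)  "+" eigenvector members
        # f_minus : (n_eig, n_grid)  "-" eigenvector members
        # Each replica: f^(k) = f0 + sum_j R_j^(k) * (f_j^+ - f_j^-) / 2,
        # with R_j^(k) drawn from a standard normal.
        rng = np.random.default_rng(seed)
        delta = 0.5 * (f_plus - f_minus)                 # symmetrized eigenvector shifts
        shifts = rng.standard_normal((n_rep, f_plus.shape[0]))
        return f0 + shifts @ delta                       # (n_rep, n_grid) array of replicas

Replicas generated this way from each Hessian set can then be pooled with the native MC replicas of the other sets to form the combined prior on which the compression acts.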

No MeSH data available.


Fig. 13: Same as Fig. 7 for the CMC-PDFs, starting from the prior with replicas (Open Access)

Mentions: We have verified that the performance of the compression algorithm is similar regardless of the prior. To illustrate this point, Fig. 13 shows the analogue of Fig. 7, namely the various contributions to the error function, for the compression of the CMC-PDF sets. Also for the CMC-PDF sets, the compression improves the ERF by an order of magnitude or more as compared to random selection.
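As a rough illustration of what minimizing the ERF against random selection involves, here is a toy compression in which the error function keeps only the first two moments of the prior and the minimization is a simple swap-based hill-climb. The paper's full ERF contains further estimators (higher moments, correlations, Kolmogorov distance) and is minimized differently, so everything below is an illustrative assumption rather than the authors' algorithm.

    import numpy as np

    def erf_two_moments(prior, subset_idx):
        # Toy ERF: squared relative differences of the mean and standard
        # deviation between the compressed subset and the full prior,
        # summed over grid points.
        sub = prior[subset_idx]
        mu_p, sd_p = prior.mean(axis=0), prior.std(axis=0)
        mu_s, sd_s = sub.mean(axis=0), sub.std(axis=0)
        return np.sum(((mu_s - mu_p) / sd_p) ** 2) + np.sum(((sd_s - sd_p) / sd_p) ** 2)

    def compress(prior, n_keep=100, n_iter=5000, seed=0):
        # Stochastic hill-climb over subsets: swap one replica at a time,
        # keeping the move whenever the ERF decreases.
        rng = np.random.default_rng(seed)
        n_rep = prior.shape[0]
        idx = rng.choice(n_rep, size=n_keep, replace=False)
        best = erf_two_moments(prior, idx)
        for _ in range(n_iter):
            trial = idx.copy()
            trial[rng.integers(n_keep)] = rng.choice(np.setdiff1d(np.arange(n_rep), trial))
            e = erf_two_moments(prior, trial)
            if e < best:
                idx, best = trial, e
        return idx, best

Comparing the final ERF with its average over many random n_keep-replica selections gives a miniature analogue of the order-of-magnitude gap reported for Fig. 13.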

