A compression algorithm for the combination of PDF sets.

Carrazza S, Latorre JI, Rojo J, Watt G - Eur Phys J C Part Fields (2015)

Bottom Line: We illustrate our strategy with the combination and compression of the recent NNPDF3.0, CT14 and MMHT14 NNLO PDF sets. The resulting compressed Monte Carlo PDF sets are validated at the level of parton luminosities and LHC inclusive cross sections and differential distributions. We determine that around 100 replicas provide an adequate representation of the probability distribution for the original combined PDF set, suitable for general applications to LHC phenomenology.


Affiliation: Dipartimento di Fisica, Università di Milano and INFN, Sezione di Milano, Via Celoria 16, 20133 Milan, Italy.

ABSTRACT

The current PDF4LHC recommendation to estimate uncertainties due to parton distribution functions (PDFs) in theoretical predictions for LHC processes involves the combination of separate predictions computed using PDF sets from different groups, each of which comprises a relatively large number of either Hessian eigenvectors or Monte Carlo (MC) replicas. While many fixed-order and parton shower programs allow the evaluation of PDF uncertainties for a single PDF set at no additional CPU cost, this feature is not universal, and, moreover, the a posteriori combination of the predictions using at least three different PDF sets is still required. In this work, we present a strategy for the statistical combination of individual PDF sets, based on the MC representation of Hessian sets, followed by a compression algorithm for the reduction of the number of MC replicas. We illustrate our strategy with the combination and compression of the recent NNPDF3.0, CT14 and MMHT14 NNLO PDF sets. The resulting compressed Monte Carlo PDF sets are validated at the level of parton luminosities and LHC inclusive cross sections and differential distributions. We determine that around 100 replicas provide an adequate representation of the probability distribution for the original combined PDF set, suitable for general applications to LHC phenomenology.
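The MC representation of a Hessian set mentioned in the abstract amounts to sampling Gaussian random combinations of the eigenvector members around the central member. The sketch below is a minimal illustration of that step, assuming symmetric eigenvector pairs already evaluated on some (x, Q) grid and stored as NumPy arrays; the function name and array layout are illustrative and are not taken from the paper's code.

```python
import numpy as np

def hessian_to_mc_replicas(f0, f_plus, f_minus, n_rep=100, seed=0):
    """Generate MC replicas from a (symmetric-)Hessian PDF set.

    f0      : array of central-member PDF values on some (x, Q) grid
    f_plus  : array of shape (n_eig, ...) with the "+" eigenvector members
    f_minus : array of shape (n_eig, ...) with the "-" eigenvector members

    Each replica is the central member plus a random Gaussian combination
    of the symmetrized eigenvector shifts.
    """
    rng = np.random.default_rng(seed)
    # Symmetrized shift associated with each eigenvector direction
    delta = 0.5 * (f_plus - f_minus)            # shape (n_eig, ...)
    # One standard-normal coefficient per eigenvector per replica
    r = rng.standard_normal((n_rep, delta.shape[0]))
    replicas = f0 + np.tensordot(r, delta, axes=(1, 0))
    return replicas                              # shape (n_rep, ...)
```

Once the Hessian sets are cast in this MC form, the compression step described in the abstract selects a reduced subset of the pooled replicas such that the statistical estimators of the subset reproduce those of the full combined prior.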




Fig. 8: Upper plots: comparison of the prior NNPDF3.0 NLO set with the compressed set, for the gluon and the down quark. Lower plots: the same comparison at a typical LHC scale, normalized to the central value of the prior set.

Mentions: First of all, we show the results for the compression of a native MC PDF set, for the case of the NNPDF3.0 NLO set. In Fig. 8 we compare the gluon and the down quark of the original and the compressed sets. Excellent agreement can be seen at the level of central values and variances. The comparison is also shown at a typical LHC scale, where similar agreement is found. The plots in this section have been obtained using the APFEL-Web online PDF plotter [58, 59]. The result that the central values of the original set are perfectly reproduced by the compressed set can also be seen from Fig. 9, where we show the corresponding per-experiment distributions for all the experiments included in the NNPDF3.0 fit, comparing the original and the compressed PDF sets, and find that they are indistinguishable.
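The comparison of central values and variances described above can be reproduced for any prior/compressed pair using the LHAPDF Python bindings. Below is a minimal sketch assuming both sets are installed locally; the name of the compressed set is hypothetical, and the x and Q values are example inputs rather than those used in Fig. 8.

```python
import numpy as np
import lhapdf  # assumes the LHAPDF Python bindings and the PDF grids are installed

def central_and_std(setname, pid, x, Q):
    """Mean and standard deviation of x*f(x, Q) over the MC replicas of a set.

    Member 0 of an NNPDF-style MC set is the average replica, so it is skipped.
    """
    members = lhapdf.mkPDFs(setname)[1:]
    vals = np.array([m.xfxQ(pid, x, Q) for m in members])
    return vals.mean(), vals.std(ddof=1)

# Prior NNPDF3.0 NLO set and a compressed version of it
# (the compressed set name below is illustrative, not an official LHAPDF name)
prior_mu, prior_sig = central_and_std("NNPDF30_nlo_as_0118", 21, 1e-3, 2.0)
comp_mu, comp_sig = central_and_std("NNPDF30_nlo_as_0118_compressed", 21, 1e-3, 2.0)
print(f"gluon at x=1e-3: prior {prior_mu:.4f} +- {prior_sig:.4f}, "
      f"compressed {comp_mu:.4f} +- {comp_sig:.4f}")
```

Scanning such comparisons over x, Q and flavours (and over higher moments and correlations) is what the validation at the level of parton luminosities and LHC observables quoted in the abstract amounts to.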

