A compression algorithm for the combination of PDF sets.

Carrazza S, Latorre JI, Rojo J, Watt G - Eur Phys J C Part Fields (2015)

Bottom Line: We illustrate our strategy with the combination and compression of the recent NNPDF3.0, CT14 and MMHT14 NNLO PDF sets. The resulting compressed Monte Carlo PDF sets are validated at the level of parton luminosities and LHC inclusive cross sections and differential distributions. We determine that around 100 replicas provide an adequate representation of the probability distribution for the original combined PDF set, suitable for general applications to LHC phenomenology.

View Article: PubMed Central - PubMed

Affiliation: Dipartimento di Fisica, Università di Milano and INFN, Sezione di Milano, Via Celoria 16, 20133 Milan, Italy.

ABSTRACT

The current PDF4LHC recommendation to estimate uncertainties due to parton distribution functions (PDFs) in theoretical predictions for LHC processes involves the combination of separate predictions computed using PDF sets from different groups, each of which comprises a relatively large number of either Hessian eigenvectors or Monte Carlo (MC) replicas. While many fixed-order and parton shower programs allow the evaluation of PDF uncertainties for a single PDF set at no additional CPU cost, this feature is not universal, and, moreover, the a posteriori combination of the predictions using at least three different PDF sets is still required. In this work, we present a strategy for the statistical combination of individual PDF sets, based on the MC representation of Hessian sets, followed by a compression algorithm for the reduction of the number of MC replicas. We illustrate our strategy with the combination and compression of the recent NNPDF3.0, CT14 and MMHT14 NNLO PDF sets. The resulting compressed Monte Carlo PDF sets are validated at the level of parton luminosities and LHC inclusive cross sections and differential distributions. We determine that around 100 replicas provide an adequate representation of the probability distribution for the original combined PDF set, suitable for general applications to LHC phenomenology.
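To make the compression step concrete, the following Python sketch illustrates the underlying idea only: select a subset of replicas from the combined prior ensemble that minimizes a distance between simple statistical estimators of the subset and of the full prior. This is a minimal toy version, not the code or error function used in the paper; the prior array, grid size and random-search strategy are illustrative assumptions.

    # Illustrative sketch (not the authors' actual algorithm): compress a prior
    # MC PDF ensemble by selecting the subset of replicas whose low-order
    # moments best match those of the full prior on a grid of points.
    import numpy as np

    def compression_distance(subset, prior):
        """Squared distance between the mean and standard deviation of the
        candidate subset and of the full prior, summed over grid points."""
        d_mean = np.sum((subset.mean(axis=0) - prior.mean(axis=0)) ** 2)
        d_std = np.sum((subset.std(axis=0) - prior.std(axis=0)) ** 2)
        return d_mean + d_std

    def compress(prior, n_comp=100, n_trials=20000, seed=None):
        """Random-search selection of n_comp replicas out of the prior.
        'prior' has shape (n_rep, n_points): one row per replica, one column
        per (flavour, x, Q) grid point."""
        rng = np.random.default_rng(seed)
        best_idx, best_dist = None, np.inf
        for _ in range(n_trials):
            idx = rng.choice(prior.shape[0], size=n_comp, replace=False)
            dist = compression_distance(prior[idx], prior)
            if dist < best_dist:
                best_idx, best_dist = idx, dist
        return best_idx, best_dist

    # Toy ensemble standing in for a 900-replica combined prior:
    prior = np.random.default_rng(0).normal(size=(900, 50))
    idx, dist = compress(prior, n_comp=100)
    print(f"selected {len(idx)} replicas, distance {dist:.4f}")

A realistic implementation would typically also match higher moments and correlations between grid points and use a more efficient minimizer than random search; the sketch above only conveys the selection principle.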

No MeSH data available.


Fig. 23: The probability distribution for two LHC cross sections: the CMS W+charm production in the most forward bin (left plot) and the LHCb Z rapidity distribution (right plot). We compare the original prior MC900 with the results from the CMC-PDF100 and MCH100 reduced sets.

Mentions: In Fig. 23 we compare the probability distributions, obtained using the KDE method, for two LHC cross sections: the CMS W+charm production in the most forward bin (left plot) and the LHCb Z rapidity distribution (right plot). We compare the original prior MC900 with the CMC-PDF100 and MCH100 reduced sets. In the case of the W+charm cross section, which is directly sensitive to the poorly known strange PDF, the prior shows a double-hump structure that is reasonably well reproduced by the CMC-PDF100 set but disappears if a Gaussian reduction, in this case MCH100, is used. For LHCb forward Z production, both the prior and CMC-PDF100 are significantly skewed, a feature which is lost in the Gaussian reduction of MCH100.
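As a rough illustration of this kind of comparison (not the analysis code used in the paper), one can estimate the probability distribution of a per-replica cross section with a Gaussian kernel density estimate and compare the prior to a reduced set. The input arrays below are hypothetical toy data standing in for the per-replica predictions.

    # Hedged sketch: compare the distribution of a single cross section over
    # the prior replicas with that over a 100-replica reduced set via a KDE.
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(1)
    xsec_mc900 = rng.normal(loc=1.0, scale=0.05, size=900)          # toy prior predictions
    xsec_cmc100 = xsec_mc900[rng.choice(900, 100, replace=False)]   # toy reduced set

    grid = np.linspace(xsec_mc900.min(), xsec_mc900.max(), 200)
    kde_prior = gaussian_kde(xsec_mc900)(grid)
    kde_reduced = gaussian_kde(xsec_cmc100)(grid)

    # Simple figure of merit: largest pointwise deviation between the two
    # estimated densities, normalised to the peak of the prior density.
    max_dev = np.max(np.abs(kde_prior - kde_reduced)) / kde_prior.max()
    print(f"maximum relative KDE deviation: {max_dev:.3f}")

Non-Gaussian features such as the double-hump structure or the skewness discussed above show up directly in the estimated densities, which is why a KDE comparison is more informative here than quoting central values and standard deviations alone.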

