A compression algorithm for the combination of PDF sets.

Carrazza S, Latorre JI, Rojo J, Watt G - Eur Phys J C Part Fields (2015)

Bottom Line: We illustrate our strategy with the combination and compression of the recent NNPDF3.0, CT14 and MMHT14 NNLO PDF sets. The resulting compressed Monte Carlo PDF sets are validated at the level of parton luminosities and LHC inclusive cross sections and differential distributions. We determine that around 100 replicas provide an adequate representation of the probability distribution for the original combined PDF set, suitable for general applications to LHC phenomenology.


Affiliation: Dipartimento di Fisica, Università di Milano and INFN, Sezione di Milano, Via Celoria 16, 20133 Milan, Italy.

ABSTRACT

The current PDF4LHC recommendation to estimate uncertainties due to parton distribution functions (PDFs) in theoretical predictions for LHC processes involves the combination of separate predictions computed using PDF sets from different groups, each of which comprises a relatively large number of either Hessian eigenvectors or Monte Carlo (MC) replicas. While many fixed-order and parton shower programs allow the evaluation of PDF uncertainties for a single PDF set at no additional CPU cost, this feature is not universal, and, moreover, the a posteriori combination of the predictions using at least three different PDF sets is still required. In this work, we present a strategy for the statistical combination of individual PDF sets, based on the MC representation of Hessian sets, followed by a compression algorithm for the reduction of the number of MC replicas. We illustrate our strategy with the combination and compression of the recent NNPDF3.0, CT14 and MMHT14 NNLO PDF sets. The resulting compressed Monte Carlo PDF sets are validated at the level of parton luminosities and LHC inclusive cross sections and differential distributions. We determine that around 100 replicas provide an adequate representation of the probability distribution for the original combined PDF set, suitable for general applications to LHC phenomenology.
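The combination-plus-compression strategy can be illustrated with a toy sketch: given a prior ensemble of MC replicas (such as the MC900 combination), select a subset of around 100 replicas that best reproduces the statistics of the full ensemble. The paper's actual algorithm minimises a richer error function (higher moments, correlations, Kolmogorov distance) with a genetic algorithm; the random search over the first two moments and the synthetic Gaussian prior below are simplifications for illustration only.

```python
import numpy as np

def compress_replicas(prior, n_keep=100, n_trials=2000, seed=0):
    """Toy compression: pick the subset of `n_keep` replicas whose mean
    and standard deviation best match the prior ensemble.

    `prior` has shape (n_replicas, n_points): each row is one MC replica
    evaluated on a grid of (x, Q) points.  A random search stands in for
    the genetic-algorithm minimisation used in the real method.
    """
    rng = np.random.default_rng(seed)
    mu, sigma = prior.mean(axis=0), prior.std(axis=0)
    best_idx, best_err = None, np.inf
    for _ in range(n_trials):
        idx = rng.choice(len(prior), size=n_keep, replace=False)
        sub = prior[idx]
        # Chi-square-like figure of merit on the first two moments.
        err = np.sum((sub.mean(axis=0) - mu) ** 2 / sigma**2
                     + (sub.std(axis=0) - sigma) ** 2 / sigma**2)
        if err < best_err:
            best_idx, best_err = idx, err
    return best_idx

# Synthetic stand-in for an MC900-like prior: 900 replicas on 5 points.
rng = np.random.default_rng(42)
prior = rng.normal(loc=1.0, scale=0.1, size=(900, 5))
idx = compress_replicas(prior, n_keep=100)
compressed = prior[idx]
```

In practice the grid of (x, Q) points and the list of statistical estimators are much larger, and the selected replica indices define the compressed LHAPDF set.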



Fig. 2: Histograms representing the probability distribution of Monte Carlo replicas for both the individual PDF sets and for the combined set, for different flavors and values of (x, Q). From top to bottom and from left to right we show the gluon at , the up quark at , the down antiquark for , and the strange PDF for . All PDFs have been evaluated at  GeV. A Gaussian distribution computed from the mean and variance of the MC900 prior is also shown.


Mentions: In Fig. 2 we show the histograms representing the distribution of Monte Carlo replicas in the individual PDF sets and in the combined set, for different flavors and values of (x, Q). From top to bottom and from left to right we show the gluon at (relevant for Higgs production in gluon fusion), the up quark at (at the lower edge of the region covered by HERA data), the down antiquark for (relevant for high-mass searches) and the strange PDF for (accessible at the LHC through W+charm production). All PDFs have been evaluated at GeV.
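The Fig. 2 comparison can be reproduced schematically: histogram the replica values of one PDF flavor at a single (x, Q) point and overlay the Gaussian built from the ensemble mean and variance. The replica values below are synthetic; with the real sets one would fill `values` by evaluating each member of the combined LHAPDF set at the chosen point.

```python
import numpy as np

# Synthetic stand-in for 900 replica values of, e.g., xg(x, Q) at one point.
rng = np.random.default_rng(1)
values = rng.normal(loc=0.5, scale=0.05, size=900)

# Normalised histogram of the replica distribution.
counts, edges = np.histogram(values, bins=30, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

# Gaussian with the same mean and variance as the ensemble,
# analogous to the curve overlaid on the MC900 histograms.
mu, var = values.mean(), values.var()
gauss = np.exp(-(centers - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
```

Plotting `counts` against `centers` together with `gauss` shows directly how Gaussian (or not) the replica distribution is at that point.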

