K2

class pgmpy.structure_score.K2(data, state_names=None)

Bases: BaseStructureScore

K2 structure score for discrete Bayesian networks using uniform Dirichlet priors.

The K2 score evaluates a Bayesian network structure on fully discrete data under a Dirichlet prior in which all pseudo-counts are equal to 1. The local score is computed as:

\[\operatorname{K2}(X_i, \Pi_i) = \sum_{j=1}^{q_i} \left[ \log \Gamma(r_i) - \log \Gamma(N_{ij} + r_i) + \sum_{k=1}^{r_i} \log \Gamma(N_{ijk} + 1) \right],\]

where \(r_i\) is the cardinality of \(X_i\), \(q_i\) is the number of distinct configurations of the parent set \(\Pi_i\), \(N_{ijk}\) is the number of samples with \(X_i = k\) under parent configuration \(j\), and \(N_{ij} = \sum_{k=1}^{r_i} N_{ijk}\).
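The formula can be checked by hand. The following sketch (plain Python, not part of the pgmpy API) evaluates the local score of B with parent set {A} term by term, using the counts from the four-row dataset in the Examples section:

```python
from math import lgamma

# Counts for X_i = B with parent set {A}, taken from the four-row
# example data: A = [0, 1, 1, 0], B = [1, 0, 1, 0].
r_i = 2                    # cardinality of B (states 0 and 1)
counts = {                 # parent config j -> [N_{ij0}, N_{ij1}]
    "A=0": [1, 1],
    "A=1": [1, 1],
}

score = 0.0
for n_ijk in counts.values():
    n_ij = sum(n_ijk)                             # N_ij = sum_k N_ijk
    score += lgamma(r_i) - lgamma(n_ij + r_i)     # log G(r_i) - log G(N_ij + r_i)
    score += sum(lgamma(n + 1) for n in n_ijk)    # sum_k log G(N_ijk + 1)

print(round(score, 3))  # -3.584
```

This agrees with the `local_score("B", ("A",))` value shown in the Examples section below.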

Parameters:
data : pandas.DataFrame

DataFrame where each column represents a discrete variable. Missing values should be set to numpy.nan.

state_names : dict, optional

Dictionary mapping each variable to its discrete states. If not specified, the unique values observed in the data are used.
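Because the score contains \(\log \Gamma(r_i)\) and \(\log \Gamma(N_{ij} + r_i)\) terms, declaring a state that never appears in the data still changes the result through \(r_i\). The helper below is an illustrative sketch (it is not part of the pgmpy API) that takes the declared cardinality as an explicit argument:

```python
import math
import pandas as pd

def k2_local(data, var, parents, r):
    # Illustrative helper (not the pgmpy API): K2 local score for `var`
    # given `parents`, with r playing the role of r_i in the formula.
    score = 0.0
    for _, grp in data.groupby(list(parents)):
        counts = grp[var].value_counts()           # N_ijk for this parent config
        n_ij = counts.sum()
        score += math.lgamma(r) - math.lgamma(n_ij + r)
        score += sum(math.lgamma(n + 1) for n in counts)
    return score

data = pd.DataFrame({"A": [0, 1, 1, 0], "B": [1, 0, 1, 0]})
print(round(k2_local(data, "B", ["A"], r=2), 3))  # states inferred from data: -3.584
print(round(k2_local(data, "B", ["A"], r=3), 3))  # one extra declared state: -4.97
```

Passing `state_names` to `K2` plays the analogous role: listing an unobserved state for a variable enlarges its cardinality and lowers the score of structures involving it.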

Raises:
ValueError

If the data contains non-discrete variables, or if the model variables are not present in the data.

References

[1]

Koller & Friedman, Probabilistic Graphical Models: Principles and Techniques, 2009, Sections 18.3.4–18.3.6.

[2]

A. M. Carvalho, Scoring functions for learning Bayesian networks, http://www.lx.it.pt/~asmc/pub/talks/09-TA/ta_pres.pdf

Examples

>>> import pandas as pd
>>> from pgmpy.models import DiscreteBayesianNetwork
>>> from pgmpy.structure_score import K2
>>> data = pd.DataFrame(
...     {"A": [0, 1, 1, 0], "B": [1, 0, 1, 0], "C": [1, 1, 1, 0]}
... )
>>> model = DiscreteBayesianNetwork([("A", "B"), ("A", "C")])
>>> score = K2(data)
>>> round(score.score(model), 3)
np.float64(-9.875)
>>> round(score.local_score("B", ("A",)), 3)
np.float64(-3.584)