Introduction
z = bayes_classifier(m, S, P, X). This function applies the Bayesian classification
rule to the data in X for M classes, each modeled by a Gaussian distribution: every
feature vector is assigned to the class j that maximizes P(j) times the Gaussian
density of class j at that vector. A sketch of an equivalent implementation follows
the parameter list below.
where,
• M: the number of classes.
• l: the number of features (for each feature vector).
• N: the number of data vectors.
• m: lxM matrix, whose j-th column corresponds to the mean of the j-th class.
• S: lxlxM matrix. S(:,:,j) is the covariance matrix of the j-th normal distribution.
• P: M-dimensional vector whose j-th component is the a priori probability of the j-th class.
• X: Nxl data matrix, whose rows are the feature vectors, i.e., data matrix in scikit-learn convention.
• y: N-dimensional vector containing the known class labels, i.e., the ground truth, or target
vector in scikit-learn convention.
• z: N-dimensional vector containing the predicted class labels, i.e., the prediction
vector in scikit-learn convention.
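
For reference, the following is a minimal NumPy/SciPy sketch of such a classifier,
assuming Gaussian class-conditional densities and the parameter shapes listed above.
The use of scipy.stats.multivariate_normal and the 0-based labels are choices of this
sketch, not necessarily those of the original code.

    import numpy as np
    from scipy.stats import multivariate_normal

    def bayes_classifier(m, S, P, X):
        """Sketch of a Bayes classifier with Gaussian class models.

        m : (l, M) array, j-th column is the mean of class j
        S : (l, l, M) array, S[:, :, j] is the covariance matrix of class j
        P : (M,) array of a priori class probabilities
        X : (N, l) data matrix, rows are feature vectors

        Returns z : (N,) array of predicted class labels in {0, ..., M-1}.
        """
        M = m.shape[1]
        # scores[i, j] = prior of class j times its Gaussian density at X[i]
        scores = np.column_stack([
            P[j] * multivariate_normal.pdf(X, mean=m[:, j], cov=S[:, :, j])
            for j in range(M)
        ])
        # pick, for each row of X, the class with the largest score
        return np.argmax(scores, axis=1)

    # Illustrative usage with two 2-D Gaussian classes (made-up values):
    # m = np.array([[0.0, 3.0], [0.0, 3.0]])        # means as columns
    # S = np.stack([np.eye(2), np.eye(2)], axis=2)  # identity covariances
    # P = np.array([0.5, 0.5])
    # z = bayes_classifier(m, S, P, np.array([[0.1, -0.2], [2.9, 3.1]]))

Note that this sketch returns 0-based labels; add 1 to z if 1-based class labels are
expected, as in the MATLAB convention.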