A simple way to implement the Naïve Bayes algorithm in Python

The Naïve Bayes algorithm is a probabilistic classifier based on Bayes’ theorem together with a strong ("naïve") assumption of conditional independence between features. It’s widely used in text classification, spam filtering, and recommendation systems due to its simplicity and effectiveness.

Naïve Bayes Algorithm Explanation

Bayes’ Theorem:
Bayes’ theorem calculates the probability of a hypothesis given the evidence:

\[P(y \mid X) = \frac{P(X \mid y) \cdot P(y)}{P(X)}\]
  • \( P(y \mid X) \): Posterior probability of class \( y \) given the features \( X \).
  • \( P(X \mid y) \): Probability of the features \( X \) given class \( y \).
  • \( P(y) \): Prior probability of class \( y \).
  • \( P(X) \): Probability of the features \( X \) (the evidence, acting as a normalizing constant).
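
As a quick illustration for a toy spam filter (all numbers here are made up purely for this example): suppose 30% of messages are spam, the word "offer" appears in 60% of spam messages and in 5% of non-spam messages. Bayes’ theorem then gives the probability that a message containing "offer" is spam:

# Toy spam-filter example; the numbers are made up for illustration
p_spam = 0.3               # P(y): prior probability of spam
p_word_given_spam = 0.60   # P(X | y): "offer" appears, given spam
p_word_given_ham = 0.05    # "offer" appears, given not spam

# P(X): total probability of seeing the word "offer"
p_word = p_word_given_spam * p_spam + p_word_given_ham * (1 - p_spam)

# Bayes' theorem: P(spam | "offer")
p_spam_given_word = p_word_given_spam * p_spam / p_word
print(round(p_spam_given_word, 3))  # about 0.837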

Naïve Independence Assumption:
Naïve Bayes assumes that the features are conditionally independent given the class:

\[P(X \mid y) = \prod_{i=1}^{n} P(x_i \mid y)\]


This simplifies the calculation significantly.

Classification Rule:
Given a new instance \( X = (x_1, x_2, \dots, x_n) \), the class \( \hat{y} \) is predicted as:

\[\hat{y} = \arg\max_y P(y) \prod_{i=1}^{n} P(x_i \mid y)\]
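
In practice, multiplying many small probabilities quickly underflows floating-point arithmetic, so implementations (including the one below) take logarithms and maximize the equivalent sum:

\[\hat{y} = \arg\max_y \left[ \log P(y) + \sum_{i=1}^{n} \log P(x_i \mid y) \right]\]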

Implementation Process

Let’s implement a simple Naïve Bayes classifier for categorical features in Python and walk through the steps involved.

import numpy as np

class NaiveBayes:
    def __init__(self):
        self.classes = None                  # unique class labels
        self.class_priors = None             # P(y) for each class
        self.class_counts = None             # number of training samples per class
        self.class_conditional_probs = None  # P(x_i | y) for each class and feature
        self.feature_values = None           # values each feature takes in the training data

    def fit(self, X, y):
        self.classes = np.unique(y)
        self.class_priors = np.zeros(len(self.classes))
        self.class_counts = np.zeros(len(self.classes))
        self.class_conditional_probs = []
        self.feature_values = [np.unique(X[:, j]) for j in range(X.shape[1])]

        for i, cls in enumerate(self.classes):
            X_cls = X[y == cls]
            self.class_counts[i] = len(X_cls)

            # Class prior P(y)
            self.class_priors[i] = len(X_cls) / len(y)

            # Conditional probabilities P(x_i | y) with Laplace (add-one) smoothing,
            # stored as one {value: probability} mapping per feature
            feature_probs = []
            for j, values in enumerate(self.feature_values):
                col = X_cls[:, j]
                probs = {value: (np.sum(col == value) + 1) / (len(col) + len(values))
                         for value in values}
                feature_probs.append(probs)
            self.class_conditional_probs.append(feature_probs)

    def predict(self, X):
        predictions = []
        for x in X:
            posterior_probs = []
            for i, cls in enumerate(self.classes):
                # Work in log space to avoid numerical underflow
                log_posterior = np.log(self.class_priors[i])
                for j, value in enumerate(x):
                    probs = self.class_conditional_probs[i][j]
                    # Values never seen for this feature fall back to the smoothed floor
                    fallback = 1.0 / (self.class_counts[i] + len(self.feature_values[j]))
                    log_posterior += np.log(probs.get(value, fallback))
                posterior_probs.append(log_posterior)
            # Pick the class with the highest posterior
            predictions.append(self.classes[np.argmax(posterior_probs)])
        return np.array(predictions)

Explanation of the Code

Initialization (__init__):

  • self.classes: Stores the unique class labels from the training data.
  • self.class_priors: Stores the prior probabilities \( P(y) \) for each class.
  • self.class_counts: Stores the number of training samples per class (used to smooth values unseen for that class).
  • self.class_conditional_probs: Stores the smoothed conditional probabilities \( P(x_i \mid y) \) for each feature \( x_i \) and each class \( y \).
  • self.feature_values: Stores the set of values each feature takes in the training data.

Training (fit):

  • np.unique(y): Get the unique classes.
  • Compute the class priors \( P(y) \).
  • Compute the conditional probabilities \( P(x_i \mid y) \) for each feature \( x_i \) and each class \( y \), using Laplace (add-one) smoothing so that no value ever gets zero probability (a small worked example of the smoothing follows this list).
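
For intuition, here is the add-one (Laplace) smoothing calculation applied by hand to a single hypothetical feature column restricted to one class; the values and counts are made up for illustration:

import numpy as np

# Hypothetical feature column for the samples of one class
col = np.array([0, 0, 0, 1])
values = np.unique(col)  # values this feature takes: [0, 1]

# Laplace (add-one) smoothing: (count + 1) / (n_samples + n_values)
for value in values:
    p = (np.sum(col == value) + 1) / (len(col) + len(values))
    print(f"P(x = {value} | y) = {p:.3f}")  # 0.667 for 0, 0.333 for 1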

Prediction (predict):

  • For each instance \( x \) in the test data:
    • Compute the (log) posterior probability \( P(y \mid x) \) for each class \( y \), summing log probabilities to avoid numerical underflow.
    • Select the class with the highest posterior probability as the predicted class (see the sketch below for a variant that returns the probabilities themselves).
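
If you also want the normalized posterior probabilities rather than just the winning class, a helper along the following lines could be added. This is only a sketch built on the attributes of the NaiveBayes class above, not part of the original code:

import numpy as np

def predict_proba(model, X):
    """Sketch: return normalized posteriors P(y | x) from a fitted NaiveBayes model."""
    all_probs = []
    for x in X:
        log_posteriors = []
        for i, cls in enumerate(model.classes):
            lp = np.log(model.class_priors[i])
            for j, value in enumerate(x):
                probs = model.class_conditional_probs[i][j]
                fallback = 1.0 / (model.class_counts[i] + len(model.feature_values[j]))
                lp += np.log(probs.get(value, fallback))
            log_posteriors.append(lp)
        # Shift by the maximum before exponentiating to avoid underflow, then normalize
        log_posteriors = np.array(log_posteriors)
        posteriors = np.exp(log_posteriors - log_posteriors.max())
        all_probs.append(posteriors / posteriors.sum())
    return np.array(all_probs)

Calling predict_proba(nb, X_test) after the usage example below would return one row of class probabilities per test instance, ordered according to nb.classes.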

Usage Example

# Example usage:
X_train = np.array([[1, 1], [1, 0], [0, 1], [0, 0]])
y_train = np.array([1, 1, 0, 0])

X_test = np.array([[1, 0], [0, 0]])

nb = NaiveBayes()
nb.fit(X_train, y_train)
predictions = nb.predict(X_test)

print("Predictions:", predictions)  # Output: [1, 0]

This example demonstrates how to train a Naïve Bayes classifier on a simple dataset and use it to predict the classes of new instances.
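
To see why the predictions come out this way, work through the first test instance \( (1, 0) \) by hand. With add-one smoothing, class 1 gives \( P(y=1)\,P(x_1=1 \mid 1)\,P(x_2=0 \mid 1) = 0.5 \times 0.75 \times 0.5 \approx 0.19 \), while class 0 gives \( 0.5 \times 0.25 \times 0.5 \approx 0.06 \), so class 1 wins. The second instance \( (0, 0) \) is the mirror image and is assigned to class 0.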

In a nutshell, Naïve Bayes is straightforward yet surprisingly effective, and it is particularly useful for text classification and other applications where the independence assumption holds reasonably well.

Every journey begins with a single step, but it’s perseverance that carries you through to the destination!! – K

Strength lies not in never falling, but in rising every time we fall!! – K

Success is not final, failure is not fatal: It is the courage to continue that counts!! – K

