## Exploring the fundamentals of logistic regression with NumPy, TensorFlow, and the UCI Heart Disease Dataset

**Outline:**

1. What is Logistic Regression?
2. Breaking Down Logistic Regression
   1. Linear Transformation
   2. Sigmoid Activation
   3. Cross-Entropy Loss Function
   4. Gradient Descent
   5. Fitting the Model
3. Learning by Example with the UCI Heart Disease Dataset
4. Training and Testing Our Classifier
5. Implementing Logistic Regression with TensorFlow
6. Summary
7. Notes and Resources

Logistic regression is a supervised machine learning algorithm that creates classification labels for sets of input data (1, 2). Logistic regression (logit) models are used in a variety of contexts, including healthcare, research, and business analytics.

Understanding the logic behind logistic regression can provide strong foundational insight into the basics of deep learning.

In this article, we’ll break down logistic regression to gain a fundamental understanding of the concept. To do this, we will:

- Explore the fundamental components of logistic regression and build a model from scratch with NumPy
- Train our model on the UCI Heart Disease Dataset to predict whether adults have heart disease based on their input health data
- Build a ‘formal’ logit model with TensorFlow

You can follow the code in this post with my walkthrough Jupyter Notebook and Python script files in my GitHub learning-repo.

Logistic regression models create probabilistic labels for sets of input data. These labels are often binary (yes/no).

Let’s work through an example to highlight the major aspects of logistic regression, and then we’ll start our deep dive:

Imagine that we have a logit model that’s been trained to predict whether someone has diabetes. The input data to the model are a person’s age, height, weight, and blood glucose. To make its prediction, the model transforms these input data using the logistic function. The output of this function is a probabilistic label between 0 and 1. The closer this label is to 1, the greater the model’s confidence that the person has diabetes, and vice versa.

Importantly: to create classification labels, our diabetes logit model first had to learn how to weigh the importance of each piece of input data. It’s likely that someone’s blood glucose should be weighted more heavily than their height when predicting diabetes. This learning occurred using a set of labeled training data and gradient descent. The learned information is stored in the model as the weight and bias parameter values used in the logistic function.
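The diabetes example above can be sketched in a few lines of NumPy. The weights, bias, and input values below are invented purely for illustration; a real model would learn its parameters from labeled training data via gradient descent.

```python
import numpy as np

# Hypothetical, hand-picked parameters for illustration only --
# a trained model would learn these values from data.
weights = np.array([0.02, -0.01, 0.015, 0.05])  # age, height, weight, blood glucose
bias = -7.0

def predict_proba(x, w, b):
    """Linear transformation followed by the logistic (sigmoid) function."""
    z = np.dot(x, w) + b              # weighted sum of the input features plus bias
    return 1.0 / (1.0 + np.exp(-z))   # squash z into a probability in (0, 1)

# One person's input data: age (yr), height (cm), weight (kg), blood glucose (mg/dL)
person = np.array([54.0, 170.0, 80.0, 140.0])
p = predict_proba(person, weights, bias)
print(p)  # a probabilistic label between 0 and 1
```

The closer `p` is to 1, the more confident this toy model is that the person has diabetes.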

This example provided a satellite-view outline of what logistic regression models do and how they work. We’re now ready for our deep dive.

**To start our deep dive, let’s break down the core component of logistic regression: the logistic function.**
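As a preview, the logistic (sigmoid) function can be written in one line of NumPy. This minimal sketch just demonstrates its key property: it maps any real number into the open interval (0, 1), with an output of 0.5 at zero.

```python
import numpy as np

def sigmoid(z):
    """Logistic (sigmoid) function: maps any real number into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

print(sigmoid(0.0))    # exactly 0.5
print(sigmoid(10.0))   # close to 1
print(sigmoid(-10.0))  # close to 0
```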
