ANN Model Flashcards
What is Sequential in TensorFlow?
In TensorFlow/Keras, Sequential is a simple way to build a feedforward neural network (ANN, CNN, etc.). It stacks layers one after another in a linear fashion.
How to import the Sequential model?
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
How to Create a Sequential Model for regression?
model = Sequential([
    Dense(16, activation='relu', input_shape=(X_train.shape[1],)),  # Input layer
    Dense(8, activation='relu'),                                    # Hidden layer
    Dense(1)                     # Output layer (regression, no activation function)
])
How to create a Sequential model for binary classification?
model = Sequential([
    Dense(64, activation='relu', input_shape=(10,)),
    Dense(32, activation='relu'),
    Dense(1, activation='sigmoid')  # For binary classification
])
How to create a Sequential model using the .add() method?
model = Sequential()
model.add(Dense(64, activation='relu', input_shape=(10,)))
model.add(Dense(32, activation='relu'))
model.add(Dense(1, activation='sigmoid'))
What are the common layers used in a Sequential model?
Dense(units, activation) – Fully connected layer
Conv2D(filters, kernel_size, activation) – CNN layer
Dropout(rate) – Reduces overfitting
Flatten() – Converts multi-dimensional input to 1D
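The layers listed above can be combined in one Sequential model. A minimal CNN sketch (all layer sizes and the 28×28×1 input shape are illustrative choices, not from the flashcards):

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, Flatten, Dropout, Dense

# Illustrative CNN stacking the four common layer types
model = Sequential([
    Conv2D(16, (3, 3), activation='relu', input_shape=(28, 28, 1)),  # CNN layer
    Flatten(),                       # Converts feature maps to a 1D vector
    Dropout(0.2),                    # Reduces overfitting
    Dense(10, activation='softmax')  # Fully connected output layer
])
model.summary()  # Prints layer shapes and parameter counts
```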
Give an example of a Sequential model with 20% dropout
model = Sequential([
    Dense(128, activation='relu', input_shape=(784,)),  # Input layer
    Dropout(0.2),                    # Dropout layer (20% of neurons randomly dropped)
    Dense(64, activation='relu'),    # Hidden layer
    Dropout(0.2),
    Dense(10, activation='softmax')  # Output layer (for classification)
])
How to import dropout?
from tensorflow.keras.layers import Dropout
model.add(Dropout(0.2))
# Drops 20% of neurons randomly
How to compile a Sequential model in Keras? Explain the parameters.
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])
After defining a neural network model in Keras, you need to compile it before training. Compilation configures the model with:
✅ Optimizer → Decides how the model updates weights
✅ Loss Function → Measures how wrong the predictions are
✅ Metrics → Tracks performance (e.g., accuracy)
Explanation of Parameters
1️⃣ Optimizer → Controls how weights are updated
'adam' → Good default for most cases (adaptive learning rates)
'sgd' → Slower but works well for simple models
'rmsprop' → Good for RNNs
2️⃣ Loss Function → Measures how far predictions are from actual values
For Classification:
'categorical_crossentropy' (Multi-class with one-hot labels)
'sparse_categorical_crossentropy' (Multi-class with integer labels)
'binary_crossentropy' (For binary classification)
For Regression:
'mean_squared_error' (MSE)
'mean_absolute_error' (MAE)
3️⃣ Metrics → Used for evaluation
'accuracy' → Works for classification
'mae' → Mean Absolute Error (for regression)
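Matching the loss and metric choices above, a sketch of compile calls for two task types (the layer sizes and 4-feature input are illustrative assumptions):

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# Regression: no output activation, MSE loss, MAE metric
reg_model = Sequential([Dense(8, activation='relu', input_shape=(4,)), Dense(1)])
reg_model.compile(optimizer='adam', loss='mean_squared_error', metrics=['mae'])

# Binary classification: sigmoid output, binary cross-entropy loss
bin_model = Sequential([Dense(8, activation='relu', input_shape=(4,)),
                        Dense(1, activation='sigmoid')])
bin_model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
```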
How to train a Sequential model?
history = model.fit(X_train, y_train, epochs=100, batch_size=32, validation_data=(X_test, y_test))
epochs=100 → Number of complete passes over the training data
batch_size=32 → Number of samples processed per weight update
validation_data=(X_test, y_test) → Checks performance on unseen data after each epoch
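A runnable sketch of the training call, using small synthetic data and a tiny epoch count so it finishes quickly (validation_split is used here instead of a separate test set; all sizes are illustrative):

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# Synthetic regression data: 100 samples, 5 features
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5)).astype('float32')
y = X.sum(axis=1, keepdims=True)  # simple linear target

model = Sequential([Dense(8, activation='relu', input_shape=(5,)), Dense(1)])
model.compile(optimizer='adam', loss='mean_squared_error')

# history.history records loss (and val_loss) for each epoch
history = model.fit(X, y, epochs=3, batch_size=32,
                    validation_split=0.2, verbose=0)
```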
How to predict on new data?
For Regression (Continuous Output)
predictions = model.predict(X_test)
print(predictions[:5]) # Show first 5 predictions
Example output (predicted house prices): [250000, 180000, 320000, 410000, 150000]
For Binary Classification (Yes/No, 0/1)
predictions = model.predict(X_test)
predicted_labels = (predictions > 0.5).astype(int) # Convert probabilities to 0 or 1
print(predicted_labels[:5])
Example output (spam = 1, not spam = 0): [0, 1, 0, 0, 1]
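The thresholding step works on any probability array, so it can be illustrated without a trained model (the probabilities below are made up):

```python
import numpy as np

# Hypothetical sigmoid outputs, shaped like model.predict(X_test)
probs = np.array([[0.12], [0.87], [0.45], [0.30], [0.91]])

# Probabilities above 0.5 become class 1, otherwise class 0
labels = (probs > 0.5).astype(int)
print(labels.ravel())  # -> [0 1 0 0 1]
```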
For Classification (Multi-Class)
predictions = model.predict(X_test)
predicted_classes = predictions.argmax(axis=1) # Get class with highest probability
print(predicted_classes[:5])
Example output (predicted class indices): [1, 0, 2, 1, 2]
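Likewise, the argmax step over softmax outputs can be shown standalone (the probabilities below are made up; each row sums to 1):

```python
import numpy as np

# Hypothetical softmax outputs: 3 samples, 3 classes
probs = np.array([
    [0.1, 0.8, 0.1],   # class 1 is most likely
    [0.7, 0.2, 0.1],   # class 0 is most likely
    [0.2, 0.3, 0.5],   # class 2 is most likely
])
predicted_classes = probs.argmax(axis=1)  # index of highest probability per row
print(predicted_classes)  # -> [1 0 2]
```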