Jupyter Notebook 1.5 - Regression and 1.6 - Hyperparameter Flashcards

1
Q

How can we convert features like mileage, engine size, and max power from text (strings) to numerical values?

A

import pandas as pd

# Strip the unit suffixes, then convert the remaining text to numbers
df['mileage'] = pd.to_numeric(df['mileage'].replace({' kmpl': '', ' km/kg': ''}, regex=True))
df['engine'] = pd.to_numeric(df['engine'].replace({' CC': ''}, regex=True))

# errors='coerce' turns any leftover non-numeric values (e.g. empty strings) into NaN
df['max_power'] = pd.to_numeric(df['max_power'].replace({' bhp': ''}, regex=True), errors='coerce')

2
Q

What is a hyperparameter?

A

A hyperparameter is a setting that you choose before training a model, such as the learning rate or the number of layers in a neural network. Unlike model parameters (such as weights), it is not adjusted automatically during training, but it can strongly affect how well the model performs.

For example, max_depth and n_estimators in RandomForestClassifier(max_depth=2, n_estimators=100, …) are hyperparameters.
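
A minimal sketch of the distinction, using scikit-learn (X_train and y_train are hypothetical training data):

~~~
from sklearn.ensemble import RandomForestClassifier

# Hyperparameters: chosen by us before training starts
clf = RandomForestClassifier(max_depth=2, n_estimators=100)

# Model parameters (the trees and their split thresholds) are then
# learned automatically during training
clf.fit(X_train, y_train)  # X_train, y_train: hypothetical training data
~~~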

3
Q

How do we select good hyperparameters in machine learning?

A

Selecting good hyperparameters is like tuning a musical instrument to find the best settings for performance. Hyperparameters are set before training and can’t be learned directly from the data. To find the best ones, we usually try different settings manually or use systematic methods like grid search or random search. More advanced techniques, such as AutoML, use reinforcement learning or evolutionary algorithms to automate this process.
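
As a minimal sketch of manual tuning with scikit-learn (X and y are hypothetical, already-prepared training data):

~~~
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Try a handful of max_depth values by hand and keep the best one
best_depth, best_score = None, -1.0
for depth in [2, 5, 10, None]:
    model = RandomForestClassifier(max_depth=depth, random_state=0)
    score = cross_val_score(model, X, y, cv=3).mean()  # X, y: hypothetical data
    if score > best_score:
        best_depth, best_score = depth, score

print(f"Best max_depth: {best_depth} (mean CV score {best_score:.3f})")
~~~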

4
Q

What are the common methods for searching good hyperparameters?

A
  1. Grid search (brute force): tries all combinations of hyperparameter values within a specified range. It is exhaustive but computationally expensive.
  2. Random search: tries random combinations of hyperparameter values within a specified range. It is faster, but may miss the best settings (see the sketch after this list).
  3. Bayesian optimization: uses the results of previous trials to choose the next hyperparameter setting, aiming for a smarter search. It is often more efficient for large search spaces than grid search or random search.
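
For instance, random search is available in scikit-learn as RandomizedSearchCV; a minimal sketch (the regressor and training data are hypothetical):

~~~
from scipy.stats import randint
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import RandomizedSearchCV

rf = RandomForestRegressor()

# Distributions and lists to sample hyperparameter values from
param_distributions = {
    'n_estimators': randint(10, 200),
    'max_depth': [5, 50, None],
}

# Tries n_iter random combinations instead of the full grid
rs_reg = RandomizedSearchCV(rf, param_distributions, n_iter=20, cv=3, n_jobs=-1)
rs_reg.fit(X_train, y_train)  # X_train, y_train: hypothetical training data
~~~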
5
Q

What is one-hot encoding in machine learning?

A

One-hot encoding is a technique used to convert categorical data into a numerical format. It creates a new binary column for each category, where a value of 1 represents the presence of the category, and 0 represents its absence. For example, if you have a “color” feature with values “red,” “blue,” and “green,” one-hot encoding will create three new columns: one for each color. Only one of these columns will have a 1 for each data point, and the rest will be 0.
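
A minimal pandas sketch of the “color” example:

~~~
import pandas as pd

df = pd.DataFrame({'color': ['red', 'blue', 'green']})

# One binary column per category; dtype=int gives 0/1 values
encoded = pd.get_dummies(df, columns=['color'], dtype=int)
print(encoded)
#    color_blue  color_green  color_red
# 0           0            0          1
# 1           1            0          0
# 2           0            1          0
~~~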

6
Q

How does grid search work in machine learning?

A

Grid search is a method used to find the best hyperparameters by trying all possible combinations of a set of hyperparameter values. You define the hyperparameters to explore in a Python dictionary, where each key is a hyperparameter and each value is a list of options to try. Grid search will then train a model for each combination and select the one with the best performance.

We can search over all 5 × 3 × 3 = 45 different settings, using 3-fold cross-validation to check the performance of each one (training the model a total of 45 × 3 = 135 times):

For example:
~~~
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV

rf = RandomForestRegressor()  # the estimator whose hyperparameters we tune

# 5 x 3 x 3 = 45 combinations
param_grid = {
    'n_estimators': [10, 50, 100, 150, 200],
    'max_depth': [5, 50, None],
    'max_features': [2, 3, None],
}

gs_reg = GridSearchCV(estimator=rf, param_grid=param_grid, cv=3, verbose=1, n_jobs=-1)
~~~
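
The search would then be run like this (X_train and y_train are hypothetical training data):

~~~
# fit() trains 45 combinations x 3 CV folds = 135 models
gs_reg.fit(X_train, y_train)

print(gs_reg.best_params_)           # the best combination found
best_model = gs_reg.best_estimator_  # refit on the full training data
~~~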

A quick reminder of how Python dictionaries (used for param_grid above) work:

~~~
# Define the dictionary
payment_info = {'monthly payment': 20000}

# Access the value of 'monthly payment'
monthly_payment = payment_info['monthly payment']
print(f"The monthly payment is: {monthly_payment}")

# Output:
# The monthly payment is: 20000
~~~
