NRC Software Flashcards

1
Q

Talk about the programming projects you did with NRC

A

o Developed an ETL pipeline between the SCADA facility and a centralized database

  • At NRC, my main project was integrating data from our SCADA system (an array of different machinery) into a data warehouse.
  • The first part of my work term was designing and launching the data warehouse in collaboration with the engineering team: gathering the business and research requirements and using them to design the schema.
  • The second part was building the ETL pipeline: software that pulled activity data from all of the facility machinery and stored it in the data warehouse (see the sketch below).
  • The last part was implementing an ML model that applied linear regression to the stored data to predict material and resource consumption.
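A minimal sketch of what one extract-transform-load cycle could look like, assuming a batch CSV export from the machinery and a PostgreSQL warehouse; the file name, column names, table name, and connection string are illustrative placeholders rather than the actual NRC pipeline.

```python
# Hypothetical ETL cycle: extract a batch export, clean it, append it to the
# warehouse. Names and the connection string are illustrative placeholders.
import pandas as pd
from sqlalchemy import create_engine

WAREHOUSE_URL = "postgresql://user:password@localhost:5432/warehouse"  # placeholder

def extract(csv_path: str) -> pd.DataFrame:
    """Pull a batch of raw machine activity data (here, from a CSV export)."""
    return pd.read_csv(csv_path, parse_dates=["timestamp"])

def transform(raw: pd.DataFrame) -> pd.DataFrame:
    """Standardize column names and drop rows missing key fields."""
    df = raw.rename(columns=str.lower)
    df = df.dropna(subset=["machine_id", "timestamp"])
    # Keep only the columns the warehouse fact table expects.
    return df[["machine_id", "timestamp", "temperature_c", "humidity_pct"]]

def load(df: pd.DataFrame) -> None:
    """Append the cleaned batch to the warehouse fact table."""
    engine = create_engine(WAREHOUSE_URL)
    df.to_sql("machine_activity", engine, if_exists="append", index=False)

if __name__ == "__main__":
    load(transform(extract("daily_export.csv")))
```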
2
Q

Talk about the database project you did with NRC

A

o Designed, launched, and managed a PostgreSQL database.
o Performed schema design, normalization of data structures, and optimization for performance and scalability in consultation with the engineering team.
o Set up backup and archiving procedures using automated backup commands and continuous archiving for point-in-time recovery

In one of my recent projects, I was responsible for designing, launching, and managing a PostgreSQL database for the Advanced Manufacturing Division. I worked closely with the engineering team to design the schema. We also optimized it to make sure the database could scale while still performing well, which was a key consideration as we expected the system to handle more traffic over time.
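As a rough illustration of the normalization idea, static machine metadata can live in one table while the high-volume readings sit in another that references it by foreign key. The table and column names below are invented for the example, not the actual NRC schema.

```python
# Illustrative normalized schema executed through psycopg2; names are invented.
import psycopg2

DDL = """
CREATE TABLE IF NOT EXISTS machine (
    machine_id   SERIAL PRIMARY KEY,
    name         TEXT NOT NULL,
    machine_type TEXT NOT NULL
);

CREATE TABLE IF NOT EXISTS machine_reading (
    reading_id    BIGSERIAL PRIMARY KEY,
    machine_id    INTEGER NOT NULL REFERENCES machine (machine_id),
    recorded_at   TIMESTAMPTZ NOT NULL,
    temperature_c REAL,
    humidity_pct  REAL
);

-- Readings are usually queried by machine and time range.
CREATE INDEX IF NOT EXISTS idx_reading_machine_time
    ON machine_reading (machine_id, recorded_at);
"""

with psycopg2.connect("dbname=warehouse user=postgres") as conn:  # placeholder DSN
    with conn.cursor() as cur:
        cur.execute(DDL)
```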

I also set up a backup and archiving process. I used automated backup commands to ensure we had regular backups in place, and I implemented continuous archiving for point-in-time recovery so we could restore the database to any specific moment if something went wrong. This gave us a lot of flexibility and confidence in case of any issues.
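A hedged sketch of what that automation might look like: a scheduled logical backup via pg_dump wrapped in Python, with the WAL-archiving settings that enable point-in-time recovery noted in the comments. Paths, the retention window, and the database name are placeholders.

```python
# Nightly logical backup via pg_dump, pruned after 14 days. Paths and the
# database name are placeholders. Point-in-time recovery additionally relies
# on WAL archiving configured in postgresql.conf, e.g.:
#   archive_mode = on
#   archive_command = 'cp %p /backups/wal/%f'
import subprocess
import time
from pathlib import Path

BACKUP_DIR = Path("/backups/warehouse")
RETENTION_SECONDS = 14 * 24 * 3600

def nightly_backup(dbname: str = "warehouse") -> Path:
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    target = BACKUP_DIR / time.strftime("%Y-%m-%d.dump")
    # Custom-format dump (-Fc) so individual tables can be restored with pg_restore.
    subprocess.run(["pg_dump", "-Fc", "-f", str(target), dbname], check=True)
    return target

def prune_old_backups() -> None:
    cutoff = time.time() - RETENTION_SECONDS
    for dump in BACKUP_DIR.glob("*.dump"):
        if dump.stat().st_mtime < cutoff:
            dump.unlink()

if __name__ == "__main__":
    nightly_backup()
    prune_old_backups()
```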

Overall, this experience gave me a strong understanding of both database design and the operational side of maintaining a scalable, high-performance system.

3
Q

Talk about the research you did with NRC

A

o Conducted in-depth research on metal binder jetting and powder metallurgy technology.
1. Researching and determining how to make 3D metal printing as cheap and sustainable as possible while maintaining quality
2. Analyzing data to find correlations between process variables and print quality, e.g., how temperature and humidity relate to the resilience of printed materials and the resources consumed (see the sketch after this list)

o Presented research data and software to manufacturers to aid in improvement initiatives and sustainability standards.
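A small sketch of the kind of correlation check described in point 2, assuming the test results are exported as one row per printed specimen; the file and column names are hypothetical.

```python
# Hypothetical check of how printing conditions correlate with measured
# resilience and material consumption. File and column names are invented.
import pandas as pd

prints = pd.read_csv("print_runs.csv")  # one row per printed specimen

# Pairwise Pearson correlations between conditions and outcomes.
cols = ["temperature_c", "humidity_pct", "rebound_height_mm", "powder_used_g"]
corr = prints[cols].corr()

# Focus on how each printing condition relates to the two outcomes.
print(corr.loc[["temperature_c", "humidity_pct"],
               ["rebound_height_mm", "powder_used_g"]])
```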

4
Q

Give us a description of what you did with NRC

A

o Managed a data integration project by developing an ETL pipeline for our SCADA facility
o Designed, launched, and managed a PostgreSQL database for the Advanced Manufacturing Division.
o Implemented an ML linear regression model to predict material and resource consumption

o Secondary responsibilities included conducting research and presenting findings to manufacturers to aid in product iterations and enhance sustainability

5
Q

Explain Bayshore concepts

A

Key Points:

Objective: Optimize the resiliency of 3D-printed metal materials by testing how temperature and humidity levels affect the material's performance, measured through rebound height.

Bayshore Resilience Method: A weighted ball is dropped from a fixed height onto the specimen. The rebound height measures the material's resiliency and indicates hysteretic energy loss (lower loss means higher resilience).

Factors (Inputs):
    Temperature (°C)
    Humidity Level (%)

Response (Output):
    Rebound Height (mm) – Higher rebound height indicates better resilience.

Design of Experiment (DOE):
    Test at different levels of temperature and humidity, e.g., three levels for each factor (20°C, 50°C, 80°C for temperature and 20%, 50%, 80% for humidity).
    Collect data for each combination.

Modeling Using RSM:
    Fit a second-order polynomial model to understand the relationship between the factors and rebound height.
    The model captures main effects and interactions between temperature and humidity (a minimal fitting-and-plotting sketch follows this list).

Response Surface Plot:
    Create a 3D response surface plot to visualize how temperature and humidity affect the rebound height.
    This helps identify the optimal conditions for maximizing resilience.

Analysis and Insights:
    Analyze the plot and fitted model to find the combination of temperature and humidity that maximizes the material’s resiliency.
    Example: The optimal temperature might be around 50°C with a humidity level of 20%, yielding the highest rebound height.
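A minimal sketch of the DOE-plus-RSM workflow above, assuming scikit-learn and matplotlib: fit a second-order polynomial to the nine factor combinations and plot the fitted surface. The rebound values in the array are made-up illustrative numbers, not real experimental data.

```python
# Fit a second-order (quadratic + interaction) model to a 3x3 DOE grid and
# plot the fitted response surface. The rebound measurements are made up.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# DOE grid: three levels each of temperature (°C) and humidity (%).
temps = np.array([20.0, 50.0, 80.0])
humidities = np.array([20.0, 50.0, 80.0])
T, H = np.meshgrid(temps, humidities)
X = np.column_stack([T.ravel(), H.ravel()])
rebound = np.array([4.2, 5.6, 4.4, 4.0, 5.0, 4.1, 3.8, 4.6, 3.9])  # mm, illustrative

# Quadratic feature expansion: T, H, T^2, T*H, H^2 (intercept fit by the model).
poly = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(poly.fit_transform(X), rebound)

# Evaluate the fitted surface on a fine grid and plot it.
tg, hg = np.meshgrid(np.linspace(20, 80, 50), np.linspace(20, 80, 50))
grid = np.column_stack([tg.ravel(), hg.ravel()])
pred = model.predict(poly.transform(grid)).reshape(tg.shape)

ax = plt.figure().add_subplot(projection="3d")
ax.plot_surface(tg, hg, pred, cmap="viridis")
ax.set_xlabel("Temperature (°C)")
ax.set_ylabel("Humidity (%)")
ax.set_zlabel("Predicted rebound height (mm)")
plt.show()
```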
6
Q

Explain Bayshore testing and data analysis

A

In one of my projects, I applied Response Surface Methodology to test how temperature and humidity levels affect specimen resiliency, which we measured by the rebound height of a weighted ball dropped from a fixed height. The rebound height indicated the material's resiliency, with higher rebound heights suggesting better resilience and lower energy loss.

I collected data from our tests and then fit a second-order polynomial model to relate the factors to the results. After fitting the model, I created a 3D response surface plot in Python (scikit-learn and matplotlib) to identify the optimal conditions. For example, you can look at the plot and see that the highest peaks indicate the most resilient specimens and the printing conditions that produced them.
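Building on the fitting sketch in the previous card (same assumed quadratic model and PolynomialFeatures transformer), the peak can also be located programmatically instead of read off the plot; the helper below is hypothetical.

```python
# Hypothetical helper: scan a fine grid of printing conditions with a fitted
# model and its feature transformer, and report the predicted optimum.
import numpy as np

def find_optimum(model, poly, temp_range=(20, 80), humidity_range=(20, 80), steps=200):
    """Return (temperature, humidity, predicted rebound) at the surface peak."""
    t = np.linspace(*temp_range, steps)
    h = np.linspace(*humidity_range, steps)
    tg, hg = np.meshgrid(t, h)
    grid = np.column_stack([tg.ravel(), hg.ravel()])
    pred = model.predict(poly.transform(grid))
    best = int(np.argmax(pred))
    return grid[best, 0], grid[best, 1], float(pred[best])
```

Calling find_optimum(model, poly) returns the temperature/humidity pair with the highest predicted rebound over the scanned grid, which is the same peak the 3D plot shows visually.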

7
Q

Explain your ML experience in detail

A

In my previous role, I developed machine learning models to predict material and supply consumption, with the goal of improving sustainability and efficiency.

I designed a custom ETL system to collect data from various machines, pulling in parameters like temperature, humidity, adhesive type, and metal type. The models I built helped identify how these factors influenced print quality and resource consumption.

For ML development, I started with data preprocessing: I used Python and pandas to clean the data, handling tasks like normalization and missing values to make it suitable for model training.
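A hedged example of that preprocessing step; the column names are stand-ins for the real SCADA parameters.

```python
# Illustrative preprocessing pass (column names invented): impute gaps,
# min-max normalize the numeric features, one-hot encode the categoricals.
import pandas as pd

def preprocess(raw: pd.DataFrame) -> pd.DataFrame:
    df = raw.copy()

    # Fill missing sensor readings with the column median.
    numeric = ["temperature_c", "humidity_pct"]
    df[numeric] = df[numeric].fillna(df[numeric].median())

    # Min-max normalization so features share a 0-1 scale.
    df[numeric] = (df[numeric] - df[numeric].min()) / (df[numeric].max() - df[numeric].min())

    # One-hot encode categorical machine parameters.
    return pd.get_dummies(df, columns=["adhesive_type", "metal_type"])
```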

For model development, I worked with scikit-learn to apply linear regression to predict material usage.
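A short sketch of the regression step, assuming the preprocessed data is saved as a CSV and that powder consumption is the prediction target; the file and column names are placeholders.

```python
# Train/evaluate a linear regression that predicts material (powder) usage.
# File and column names are placeholders for the real dataset.
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

data = pd.read_csv("preprocessed_runs.csv")   # output of the preprocessing step
X = data.drop(columns=["powder_used_g"])      # machine/print parameters
y = data["powder_used_g"]                     # material usage target

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
model = LinearRegression().fit(X_train, y_train)

print("MAE (grams):", mean_absolute_error(y_test, model.predict(X_test)))
```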

Then, I set up a monitoring system using Bash and Python to track the model’s performance over time and established a feedback loop to retrain the models periodically as new data came in.
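A sketch of the Python side of that feedback loop under assumed details: the error metric, threshold, and model path are placeholders, and the Bash scheduling that triggers it is not shown.

```python
# Compare the saved model's error on the latest batch against a threshold and
# retrain on the full history when it drifts. Threshold and paths are placeholders.
import joblib
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error

MAE_THRESHOLD = 5.0               # grams; placeholder alert level
MODEL_PATH = "usage_model.joblib"

def check_and_retrain(recent: pd.DataFrame, history: pd.DataFrame) -> float:
    """Return the recent-batch MAE; retrain and overwrite the model if it drifts."""
    model = joblib.load(MODEL_PATH)
    X_recent = recent.drop(columns=["powder_used_g"])
    mae = mean_absolute_error(recent["powder_used_g"], model.predict(X_recent))

    if mae > MAE_THRESHOLD:
        X_all = history.drop(columns=["powder_used_g"])
        joblib.dump(LinearRegression().fit(X_all, history["powder_used_g"]), MODEL_PATH)
    return mae
```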

Overall, this project gave me hands-on experience with the entire ML lifecycle—everything from data collection and preprocessing to model development, deployment, and continuous monitoring.
