Lecture 3 Ferrario Flashcards
Describe the multifaceted nature of AI research
AI involves various elements like people, structures, organizations, physical systems, and tasks.
What is the importance of explanations in AI?
Explanations play a critical role in AI, especially in how humans interact with AI systems. They provide transparency and understanding of AI systems’ decisions and actions. Explanations build trust, enable users to verify results, and help identify biases or errors. They ensure accountability and fairness in AI applications and foster a more collaborative way of working.
Explain the life cycle perspective of AI systems
It includes design, development, deployment, and maintenance. This ensures that AI systems are carefully planned, developed, and managed throughout their lifespan.
What is the ‘naive XAI hypothesis’ in explainable AI?
It suggests that if an AI system’s decision process is explainable, users will place more trust in it.
What are some issues faced by the hypothesis of explainability and trust in AI?
Challenges in measuring trust and generalizing empirical results, and a lack of consensus on what explainability and trust are.
How does explainability relate to trust in AI?
When AI systems are explainable, users can understand how they make decisions, which increases transparency and builds trust. However, the relationship is complex: it depends on factors such as the specific context of use and the limitations in defining and measuring trust. A critical evaluation is therefore necessary.
What are the significant aspects of contextualizing explanations in AI?
Understanding who is explaining what to whom, the models or types of explanations available, the criteria for a good explanation, and how the explanation should be accepted and acted upon.