Principles of Assessment, Prescription, and Exercise Program Adherence Flashcards
Exercise Specialist’s Responsibilities
Educate clients, Conduct health screening and testing, Design exercise programs, Critique clients’ exercise performance, Motivate clients
Components of Physical Fitness
Cardiorespiratory endurance, Musculoskeletal fitness, Body weight and body composition, Flexibility, Balance
Purposes of Fitness Testing
Health screening, Develop a fitness profile, Identify physical abilities in need of improvement, Set goals, Aid in exercise prescription, Evaluate progress
How to identify physical abilities in need of improvement
Appropriate testing and analysis can identify which of an individual's physical qualities should be targeted in the prescribed exercise program.
Test
A procedure for assessing ability in a particular endeavor
Field test
A test used to assess ability that is performed away from the laboratory and does not require extensive training or expensive equipment
Measurement
The process of collecting test data
Evaluation
The process of analyzing test results for the purpose of making decisions
Midtest
A test administered one or more times during the training period to assess progress and modify the program as needed to maximize benefit
Formative evaluation
Periodic reevaluation based on midtests administered during the training, usually at regular intervals
Posttest
A test administered after the training period to determine the success of the training program in achieving the training objectives
Validity
The degree to which a test or test item measures what it is supposed to measure; one of the most important characteristics of testing. Quantified by the validity coefficient, the standard error of estimate (SEE), and sensitivity and specificity.
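The validity coefficient and SEE mentioned above are simple statistics: the validity coefficient is typically the Pearson correlation between field-test scores and a criterion (laboratory) measure, and the SEE is the standard error of estimate from regressing the criterion on the test. A minimal sketch, using made-up illustrative scores (not real data):

```python
import math

# Hypothetical example: field-test scores vs. a criterion lab measure.
# All numbers are illustrative only.
field = [42.0, 48.5, 51.0, 55.5, 60.0]   # field-test scores
lab = [41.2, 47.0, 52.3, 56.1, 61.4]     # criterion (laboratory) scores

n = len(field)
mx, my = sum(field) / n, sum(lab) / n
sxx = sum((x - mx) ** 2 for x in field)
syy = sum((y - my) ** 2 for y in lab)
sxy = sum((x - mx) * (y - my) for x, y in zip(field, lab))

# Validity coefficient: Pearson correlation between test and criterion
r = sxy / math.sqrt(sxx * syy)

# Standard error of estimate (SEE) from the simple linear regression
# of the criterion on the field test (n - 2 degrees of freedom)
see = math.sqrt(syy * (1 - r ** 2) / (n - 2))

print(f"validity coefficient r = {r:.3f}")
print(f"SEE = {see:.2f}")
```

A coefficient near 1.0 with a small SEE suggests the field test tracks the criterion closely; the acceptable thresholds depend on the test and population.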
Construct Validity
The ability of a test to represent the underlying construct (the theory developed to organize and explain some aspect of existing knowledge and observations).
Face Validity
The appearance to the athlete and other casual observers that the test measures what it is purported to measure.
Concurrent Validity
The extent to which test scores are associated with those of other accepted tests measuring the same ability.
Predictive Validity
The extent to which the test score corresponds with future performance or behavior.
Discriminant Validity
The ability of a test to distinguish between two different constructs.
Measurement error can arise from the following
Intrasubject (within subjects) variability, Lack of interrater (between raters) reliability or agreement, Intrarater (within raters) variability, Failure of the test itself to provide consistent results
Intrasubject Variability
The lack of consistent performance by the person tested.
Interrater Reliability
The degree to which different raters agree; also referred to as objectivity or interrater agreement.
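One simple way to express interrater agreement is the percentage of trials on which two raters assign the same score. A minimal sketch with hypothetical pass/fail ratings (more formal indices, such as Cohen's kappa, correct for chance agreement):

```python
# Hypothetical ratings: two testers independently score the same
# ten trials as pass or fail. Data are illustrative only.
rater_a = ["pass", "pass", "fail", "pass", "fail",
           "pass", "pass", "fail", "pass", "pass"]
rater_b = ["pass", "pass", "fail", "pass", "pass",
           "pass", "pass", "fail", "pass", "fail"]

# Count trials where the two raters gave the same score
agreements = sum(a == b for a, b in zip(rater_a, rater_b))
percent_agreement = 100 * agreements / len(rater_a)
print(f"interrater agreement = {percent_agreement:.0f}%")
```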
Intrarater Variability
The lack of consistent scores by a given tester.
Selection and Training of Testers for Test Administration
Provide testers with practice and training, Ensure consistency among testers
Recording Forms for Test Administration
Prepare scoring forms ahead of time to increase efficiency and reduce recording errors
Experience and Training Status on Test Selection
Consider the client’s ability to perform the technique, Consider the client’s level of strength and endurance training