The Distance Education Program Effectiveness Guidelines is a comprehensive manual that guides Program Coordinators step by step through the documentation process. We recommend reviewing the manual before identifying the Sources of Data that will be used for the review. The manual describes key components of student learning and student experience sources and includes:

  • An introduction to DE Program Effectiveness reporting at Texas A&M University
  • How to use HelioCampus to document the DE program's Sources of Data & Findings/Use of Results
  • Descriptions of appropriate sources of data
  • Detailed information about each section of the required documentation, including examples

Recent Updates:

  • February 2026: Added information about key differences between 25-26 and 26-27 assessment forms; namely, the addition of a Final Approver (Department) workflow step, minor workflow step name changes, and minor updates to item prompts for clarity.

Minimum Requirements for DE Effectiveness Reporting

Sources of Data
  • Two sources of data are identified. 
    • One for student learning (e.g., PLO or CLO assessment) 
    • One for student experience (e.g., SCEs, student survey, graduation rates, retention, etc.)
  • Comparable traditional program(s)/offering(s) are identified.
Findings & Use of Results
  • Findings are contextualized and appropriately compared to the traditional program offering.
  • Findings must be disaggregated if the report includes multiple DE credentials.
  • Regularly low-enrolled programs must aggregate their data across cycles to report findings annually.
    • Undergraduate: Min. of 10 students for reporting
    • Graduate: Min. of 5 students for reporting
  • The Use of Results section describes an action to improve the overall student experience. 

Compliance Indicators

As of the 2025-2026 reporting cycle, DE Liaisons within each academic college/school provide feedback at two stages of reporting: (1) Sources of Data and (2) Findings & Implications. Program Coordinators receive an email notification when DE Liaisons submit feedback, at which point they can make any necessary updates to the reporting form.

At the close of each DE Program Effectiveness reporting cycle, OIEE staff review all DE reports, provide final comments, and assign a Compliance Indicator based on the degree to which reporting requirements were met. Compliance Indicators are used by OIEE primarily to determine which programs might require additional support and guidance in future reporting cycles.

Rubric

Compliance Indicator | Description of Criteria
Exemplary
The report goes beyond minimum requirements. This may include the following: 
  • At least three sources of data are identified and thoroughly described (e.g., one student learning source and two student experience sources). 
  • Comparable traditional program(s)/offering(s) are identified.* 
  • Includes consideration of the needs and circumstances unique to the program’s mode of delivery. 
  • Findings are contextualized and appropriately compared and/or disaggregated (e.g., if multiple DE credentials are included in the same report).
  • Clear and detailed explanation as to how the results will be used for DE program improvement.
Sufficient
All minimum requirements were met:
  • Two sources of data are identified and described (i.e., one student learning source and one student experience source).
  • Comparable traditional program(s)/offering(s) are identified.*
  • The final report is generally clear and well-aligned. Some sections could be strengthened by including more detail or information.
  • Findings are contextualized and appropriately compared and/or disaggregated (e.g., if multiple DE credentials are included in the same report).
  • Explanation of how the results will be used for DE program improvement is included. Implementation may not be fully clear, but the intent is.
Needs Improvement
All minimum requirements were met, but one or more of the following is true:
  • Both required sources of data are described/reported, but the descriptions are insufficient or unclear.
  • Data lacks meaningful comparison.
  • Explanation of how the results will be used is vague or lacks a clear connection to the reported findings and/or mode of delivery.
Noncompliant
Report was not submitted, or one or more required components of the report is missing (sources of data, findings, use of results, comparable programs/contextualization of data). The report does not demonstrate continuous improvement.

*If the program is not offered both via DE technology and face-to-face (FTF), a reasonable comparator might be a FTF program in the same department at the same degree level, or even in a different department within the same college. If a reasonably comparable FTF program cannot be identified, determine whether there are key courses offered in both modalities that could be compared via Student Course Evaluation results. If no such courses exist, or if there is not a reasonable comparator program, clearly explain the unique nature of the program in the report and include sources of data that explicitly speak to the student experience in a distance education program.