Measures of Change in the TDB Pilot Program

The impact of the pilot program was measured across three domains: student change, change for teachers of students who are deafblind (TDB), and systems change. Technical assistance was tied directly to these measures and was provided over a three-year cycle.

Pilot outcome benchmarks included, but were not limited to: programming and teaching strategies specific to deafblind education, use of appropriate assessment tools, support with deafblind child count reporting, establishing teacher of students who are deafblind roles, and collaborative teaming. 

Flow chart: the three components of change measured during the TDB Pilot Program (described in the text below).

Student Change:

Ten students were used as case studies during the pilot. As a baseline, none of the case study students were receiving programming specific to deafblindness at the start of the pilot. The numbers below represent change from the beginning to the end of the pilot.

Students with assessment specific to being deafblind – 10 out of 10 (100%)

Student IEPs created with deafblind strategies, modifications, and accommodations specific to each child’s individual sensory needs

      • 8 out of 10 students have IEP quality indicators in file (80%)
      • 8 out of 10 students have notes indicating progress (80%)
      • 8 out of 10 students have appropriate deafblind specific adaptations, modifications, accommodations (80%)

Student programming specifically related to being deafblind

      • Use of communication systems in all instruction – 10 out of 10 (100%)
      • Use of routines (as appropriate) – 6 out of 6 (100%)
      • Use of DB specific adaptations, modifications, accommodations during instruction – 10 out of 10 (100%)
      • Use of appropriate technology – 100%

Teacher of Students who are Deafblind Change

Seven teachers were originally enrolled in the pilot. All held certification in Deaf/hard of hearing, visual impairment, or both, with varying experience working with students who are deafblind. Benchmarks for change were based on the established roles of the TDB. Measurement tools included video logs, action plans, and teacher/student portfolios.

      • Obtained pre-service coursework – (2 out of 7)
      • Stayed current in the field – (7 out of 7)
      • Contributed to the field of deafblind education – (2 out of 5)
      • Participated as a member of the IEP team, providing expertise in deafblind education – (5 out of 7)
      • Participated in IEP and IFSP meetings to ensure appropriate programming and services specific to students who are deafblind – (4 out of 7)
      • Provided direct and indirect consult services to the child who is deafblind, educational teams, and families – (7 out of 7)
      • Supported the intervener model in the local district – (4 out of 7)
      • Assisted the local district in child find activities for students who are deafblind and completion of the annual TEA Deafblind census – (2 out of 7)

Systems Change

Benchmarks for measuring systems change included: increased efficiency in deafblind child count reporting, school program change as noted by increased use of appropriate and specific deafblind strategies, collaboration with outside agencies, and greater satisfaction among families and educational professionals with the IEP/FIE processes.

      • Products developed through the pilot included: a comparison of the roles of TDHH, TVI, SPED, and TDB teachers; an outline of the “Roles of the TDB”; the Teacher of Students who are Deafblind Pilot website; and a collaborative model for TDHH/TVI teachers supporting students who are deafblind.
      • Established support for deafblind specific college coursework through a grant from the Texas Education Agency.
      • Established TDBs assigned to caseloads specific to students who are deafblind.
      • Piloted TDB collaboration with NCDB to create new intervener coursework modules (OHOA).