Validation

TBI’s indexes were developed from a comprehensive analysis of current theory, law, regulation, and practice associated with board governance. The questions were drawn from a much larger pool of candidate content. Content validation followed a scientific process, as follows.

  • Structure.  The areas of relevance for boards were determined from the empirical literature.
  • Screening.  Multiple questions were developed for each area. An iterative review by content experts refined the questions.
  • Expanding the set.  Content experts developed additional content covering leadership, legal, process, and management concerns that affect boards.
  • Relevance.  Instrument analysis by several experts narrowed the question set from several hundred to fewer than 100.
  • Alpha tests.  A series of early instrument-development tests with content experts, actual board members, university students, and other reviewers provided considerable insight into the quality of TBI content.
  • Item analysis.  Each question was examined using standard statistical item-analysis procedures (an illustrative sketch follows this list).  Item analysis typically assumes relatively large samples, but the results were still helpful in editing or eliminating several questions.
  • Factor analysis.  Responses from an initial set of respondents were used to identify redundant constructs, i.e., constructs that essentially captured the same content.  Most of the redundant constructs were dropped (see the redundancy sketch following this list).
  • Globular clusters.  Several readings of the response set by experts identified items that addressed multiple issues within a single question.  These clusters were separated by splitting the question into several questions or dropping the least relevant cluster elements.
  • Reliability.  Instrument analysis included both verbal respondent feedback and item-variance analysis.  The process identified about 15 constructs that respondents found difficult to assess.  These constructs were rewritten, divided, or dropped to improve item reliability (see the reliability sketch following this list).
  • Applicability.  Some items were not relevant to, or not within the knowledge of, all board members.  These questions were either segmented for a single administrator response or dropped from the instrument.
  • Changeable.  Several items addressed matters that could not realistically be trained or changed and were dropped for lack of relevance.
  • Questions not asked.  One of the most difficult issues with any assessment tool is that respondents can only answer the questions they are asked.  TBI addresses the most relevant board issues but must limit the number of questions to avoid respondent fatigue.  Three solutions address the possibility that an important question was not asked:
      • An open-ended question at the end of the instrument asks respondents to identify the most critical issues facing the board.
      • An open-ended question at the end of the instrument asks respondents for comments on any relevant issues not addressed by TBI.
      • The instruments are designed to be readily customizable, so that questions important to a particular organization may be added.
  • Beta tests.  A series of beta tests with board members again provided a wealth of information on content relevance, understandability, and reliability.  Every board that participated in early TBI development provided feedback on the methods, process, and content.
  • Continuous review. TBI is designed for continuous content review and validation.  The extensive reporting that is prepared for each board provides considerable knowledge about item relevance and reliability.  Summary reports across multiple boards are also designed to detect item variance, scale issues or other factors that could potentially indicate difficulties with constructs.
  • Empirical analysis.  TBI includes a data warehouse feature that enables analysts to examine individual segments of boards as well as all boards.  These features apply current technologies for validation, creating normative comparisons, and improving content.  The empirical analysis capability essentially creates an expert system that organizations may use to detect issues on their boards relative to other, similar boards (a normative-comparison sketch follows this list).
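
The item-analysis step can be illustrated with a small sketch. The code below is not TBI's actual implementation; it is a generic Python example, using a hypothetical `responses` table of Likert-scale answers, that computes the item statistics commonly used when deciding which questions to edit or drop: item means, variances, and corrected item-total correlations.

    # Illustrative item analysis on a hypothetical response matrix
    # (rows = respondents, columns = items, 1-5 Likert scale).
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(0)
    responses = pd.DataFrame(
        rng.integers(1, 6, size=(40, 8)),
        columns=[f"item_{i}" for i in range(1, 9)],
    )

    total = responses.sum(axis=1)
    report = pd.DataFrame({
        "mean": responses.mean(),                     # endorsement level ("difficulty")
        "sd": responses.std(ddof=1),                  # near-zero variance adds little information
        "item_total_r": [
            responses[c].corr(total - responses[c])   # corrected item-total correlation
            for c in responses.columns
        ],
    })
    print(report.round(2))  # items with low item_total_r are candidates to edit or drop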
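The redundancy screen described under "Factor analysis" can be sketched in a similar way. The example below runs a generic exploratory factor analysis (via scikit-learn) plus a blunt correlation screen, reusing the hypothetical `responses` frame from the previous sketch; TBI's actual method and thresholds are not specified here.

    # Illustrative redundancy screen, reusing the hypothetical `responses` frame above.
    from sklearn.decomposition import FactorAnalysis

    fa = FactorAnalysis(n_components=2, random_state=0).fit(responses.values)
    loadings = pd.DataFrame(
        fa.components_.T, index=responses.columns, columns=["factor_1", "factor_2"]
    )
    print(loadings.round(2))  # items loading heavily on the same factor may be redundant

    # A simpler screen: flag pairs of items whose responses correlate above 0.85
    corr = responses.corr().abs()
    pairs = [
        (a, b)
        for i, a in enumerate(corr.columns)
        for b in corr.columns[i + 1:]
        if corr.loc[a, b] > 0.85
    ]
    print(pairs)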
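For the reliability step, a common internal-consistency statistic is Cronbach's alpha. The sketch below applies the standard formula to a hypothetical three-item construct, again reusing the `responses` frame; the source does not state which reliability statistics TBI actually uses.

    # Illustrative internal-consistency check (Cronbach's alpha) for one construct.
    def cronbach_alpha(items: pd.DataFrame) -> float:
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    construct = responses[["item_1", "item_2", "item_3"]]   # hypothetical construct grouping
    print(round(cronbach_alpha(construct), 2))              # values well below ~0.7 suggest rework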
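Finally, the normative comparisons mentioned under "Empirical analysis" can be illustrated with a self-contained sketch: one board's score on a construct is expressed as a percentile against a peer segment drawn from a hypothetical warehouse table. The column names, segments, and scores are invented for illustration; the actual TBI data warehouse schema is not described in this document.

    # Illustrative normative comparison against a hypothetical warehouse of board scores.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(1)
    warehouse = pd.DataFrame({
        "board_id": range(1, 51),
        "segment": ["nonprofit"] * 25 + ["corporate"] * 25,
        "oversight_score": rng.normal(3.8, 0.4, size=50).round(2),
    })

    board_score = 4.1                                  # the board being assessed
    peers = warehouse.loc[warehouse["segment"] == "nonprofit", "oversight_score"]
    percentile = (peers < board_score).mean() * 100    # share of peer boards scoring below
    print(f"Board scores above {percentile:.0f}% of peer nonprofit boards on oversight")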

A penetration test was completed by an independent third party to validate the security of the software.
