The core research team began to form in early 2013. The core team members guided the overall project, served as liaisons to the individual country investigators (CIs), and provided them with support. Each country also had its own team of one to three CIs, although, in rare cases, we approved four members. Figure 3 (see next page) summarizes the structure of the project teams.
The survey instrument constituted the core team’s first critical deliverable. The process of refining the list of potential items took over a year. We thought it important to use previously validated items for the constructs so that the instrument would have good psychometric properties. We soon realized that gathering data on so many independent and dependent variables meant a long survey. At the same time, we needed to shorten the instrument as much as possible to encourage participants to complete it. We did our best on both ends. Ultimately, the instrument contained 160 items, and pilot participants reported that they could complete it in about 25 minutes. The pilot tests also helped us refine the instrument. We froze the instrument (i.e., made no further changes) at the end of 2013 with one exception: in special cases, CIs could add a few of their own questions, but in no case could they replace, modify, or eliminate any of the existing questions, so that data collection would be uniform across all countries. The Institutional Review Board at the University of North Carolina at Greensboro reviewed and approved the instrument and exempted it from further review.
After we finalized the instrument, the core team created templates for recruiting country teams and
country investigators. These templates included a memorandum of understanding (MOU) that outlined the
roles and responsibilities of the CIs and the core team and a request for survey administration plan
(RSAP) so that CIs could submit their data-collection plan to the core team for approval. Once the core
team approved the RSAP and both parties signed the MOU, we shared the instrument with the CIs. We
also instructed the CIs to obtain their own institutional review board clearance if necessary.
We solicited CIs organically through professional contacts and conferences, requests on the AISWorld
listserv, and direct emails to faculty listed on the AIS faculty directory. We also organized information
sessions at the GITMA and AMCIS conferences in 2013 and 2014. In each communication, we described
the benefits to the CIs and their roles and responsibilities (see Table 3, two pages ahead). The response was slow but steady and, at the same time, encouraging; it also required patience and perseverance on our part.
We sought to obtain data from countries that represented every major region of the world. The project
required data collection from countries representing different cultures, levels of economic growth,
religious beliefs, and political systems. We recruited and selected CIs after a careful screening process.
We needed local country investigators because they understand the local culture and how to best
approach local businesses to participate. We also charged them with translation/back-translation3 of the
instrument (if translation to the local language was necessary) in order to ensure that wording and
meaning were appropriate for the local culture. Given the wide disparity in population sizes and
development levels among the countries in the world, we could not feasibly achieve true representative
sampling in total; however, we sought a large database that would be respected for its breadth of cultures.
We gave the CIs discretion in what method they used to collect data and how to approach organizations in
their country. All respondents needed to be in the IT profession. We instructed the CIs to provide us with a
minimum usable sample size of 300 IT employees in their country. We suggested they collect responses
from 10 to 15 IT employees from 20 to 30 companies in a variety of industries. The CIs used multiple
data-collection methods: mail surveys, face-to-face surveys, email surveys, and Web-administered
surveys. We assured respondents of complete anonymity and maintained it throughout data collection. We also recommended that the CIs go through the CEO/CIO or another senior executive to recruit multiple IT employees from the same organization, and most CIs followed this practice, which resulted in both a higher response rate and higher data quality.
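As a quick arithmetic check (our illustration, not part of the study materials), the suggested per-company quotas bracket the 300-response minimum:

```python
# Illustrative sanity check of the suggested sampling quotas
# (hypothetical sketch; not part of the study's actual procedure).
min_per_company, max_per_company = 10, 15   # suggested IT employees per company
min_companies, max_companies = 20, 30       # suggested companies per country

low = min_per_company * min_companies       # lower bound on responses
high = max_per_company * max_companies      # upper bound on responses

print(f"Expected responses per country: {low}-{high} (minimum required: 300)")
```

The suggested ranges thus yield between 200 and 450 responses per country, making the 300-respondent minimum attainable near the middle of the range.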
Data collection was completed at the end of 2017. We have full data from 37 countries comprising over 11,000 observations. Table 4 (see next page) lists the countries participating in the World IT Project.