The SciWise research initiative aims to improve science engagement practice through stakeholder participation, support education, and build understanding of how attitudes and actions develop among public audiences who visit science organisations and events, such as science centers, festivals and museums.
Furthermore, the initiative brings together science researchers and practitioners around common goals in an improvement-oriented network. In pursuit of collective improvement, it enables self-monitoring, collaboration and the sharing of good practices.
Finally, this mission is pursued through innovative, up-to-date social science research methods and technologies that enable impact evaluation and provide the investigative capacity the mission requires.
- Facilitate information sharing within the community.
- Improve community practice with good information.
- Report findings through publications.
This project addresses common constraints and acute limitations on robust, high-quality and effective visitor research and impact evaluation (e.g. Jensen 2014).
The SciWise research project extends survey designs and methods from other recent global research projects. Information collected in this project supports better understanding of key areas for visitor research.
SciWise connects to the community of practice, ensuring that audience insights are useful and shareable, and that they effectively measure visitor experiences and learning impacts (e.g. Dawson & Jensen 2011).
Participating science institutions are empowered by strong evidence delivered efficiently through automated impact evaluation, with real-time and ongoing insights into visitor experiences and outcomes, meaningful comparisons and sector benchmarks.
Participation in the SciWise audience research project includes the following project features:
The SciWise project team ensures a fit between the audience research and evaluation methods and your institution’s unique requirements. Surveys are designed for up to 25 questions with response options (more can be included, but we advise caution in exceeding this number for the sake of respondents’ satisfaction with the evaluation experience). Research questions will be confirmed through consultation with your institution.
Preparation of a staff- or visitor-facing method to collect the email addresses used to enroll visitors into the evaluation. This will include at least one demographic/visitor-type variable (e.g. visit type: family, individual, couple, other) and one impact-related variable. The online survey is built and provided to your institution with embed code that allows the survey to be integrated with your institution’s web page and/or booking systems (optional add-on). This part of the survey system delivery process is guided by the SciWise technical team.
Visitors are sent automated invitations at the close of business on the day of their visit. The invitations ask visitors to complete the online survey hosted on your institution's website.
The system provides a flexible platform for evaluation and a basic level of analysis. Your institution’s stakeholders will be provided access to the Dashboard, with pre-configured graphs that are generated automatically and in real time as surveys are completed, with benchmarking against other institutions on an aggregate level where applicable. The statistical analysis of data can be modified upon request, and more advanced statistics can be implemented when required for reporting. Your institution will also have access to publications from the overall SciWise project.
Data are collected and stored on a server with automatic back-up. Data are always available for download and owned by your institution.
The cornerstone of SciWise research is its benchmarking objectives and capabilities, made possible through expert-designed questions, survey methodology and technology. Benchmarking provides points of comparison with other participating institutions in Key Benchmarking Areas to enable shared knowledge in pursuit of good practice.
Benchmarking in this initiative gives participating institutions real-time insights about how audience experience and educational impacts change internally over time and in comparison to other institutions nationally and globally in key benchmarking areas.
These key benchmarking areas have been developed in part based on international guidelines published by science associations and national regulations. Questions are organized by visit-related categories and the initiative’s key benchmarking areas.
Understanding your organization’s audience profile is important for targeting audience development initiatives as well as contextualizing survey responses. Automated analytics show whether there are systematic demographic patterns in terms of participation or outcomes.
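To illustrate what "systematic demographic patterns in participation or outcomes" could look like in practice, here is a minimal sketch that averages an outcome score per visitor group. All field names and data are illustrative assumptions, not SciWise's actual schema or analytics.

```python
from collections import defaultdict

# Hypothetical survey records: a visitor-type label and a 1-5 outcome score.
responses = [
    {"visit_type": "Family", "satisfaction": 5},
    {"visit_type": "Family", "satisfaction": 4},
    {"visit_type": "Individual", "satisfaction": 3},
    {"visit_type": "Individual", "satisfaction": 2},
    {"visit_type": "Couple", "satisfaction": 4},
]

def mean_outcome_by_group(records, group_key, outcome_key):
    """Average an outcome per demographic group to surface systematic patterns."""
    groups = defaultdict(list)
    for r in records:
        groups[r[group_key]].append(r[outcome_key])
    return {g: sum(v) / len(v) for g, v in groups.items()}

print(mean_outcome_by_group(responses, "visit_type", "satisfaction"))
# e.g. {'Family': 4.5, 'Individual': 2.5, 'Couple': 4.0}
```

A real automated analytics layer would add significance testing before flagging a group difference as a pattern.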
An essential concern for any science engagement organization or event is the quality of the visitor experience. From the food and facilities to talks and interactions with staff or volunteers, monitoring the visitor experience in real time allows practitioners to intervene as soon as a problem emerges.
Increasing public engagement with science is a major theme highlighted in science engagement institutions’ policies and guidelines. SciWise enables participating organizations to see their impact on science awareness and attitudes, with real-time results benchmarked against institution-type, national and global averages.
This initiative enables comparative impact evaluation at a regional, national and global level. This is accomplished using innovative research technology, standardized systems for comparative benchmarking and a focus on audience experiences and educational impacts.
Comparisons are based on standardized measurement tools to understand audience profiles, experiences and outcomes relating to science engagement. This initiative uses benchmarking to aid in discovery and distribution of best practices.
The survey system used in this project has been built to enable best practice for research and impact evaluation. Therefore, participation in this project includes impact evaluation as a standard feature, allowing pre-visit and post-visit data collection at a minimum. Long-term follow-up surveys are also possible; for example, they can be sent to visitors three, six, or nine months after their initial visit. The system is user-friendly and accessible to institutional users with no previous research expertise.
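The pre-visit/post-visit design described above can be summarized with a minimal paired comparison on a parallel impact item. The data, scale and function name here are illustrative, not part of the SciWise system.

```python
# Hypothetical pre- and post-visit scores on the same impact item,
# keyed by visitor, on an illustrative 1-5 scale.
pre = {"v1": 3, "v2": 2, "v3": 4}
post = {"v1": 4, "v2": 4, "v3": 5}

def mean_change(pre_scores, post_scores):
    """Mean per-visitor change on a parallel item measured before and after a visit."""
    diffs = [post_scores[v] - pre_scores[v] for v in pre_scores]
    return sum(diffs) / len(diffs)

print(mean_change(pre, post))  # mean gain of 4/3 on the illustrative scale
```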
Learn more about SciWise's automated evaluation system and options available for data collection.
Once you join SciWise, you get full access to the Question Bank. But what is it and how does it work?
For the first time, science institutions can have ongoing, long-term and integrated evaluation of audience profile, visitor experience, membership retention patterns and educational impacts, with real-time comparisons to sector benchmarks. The mission and objectives are enabled by innovative technology that automates data collection, aggregation and processing, sector-level analysis, and evaluation reporting. Audience data are analyzed automatically within a project dashboard that provides each institution with benchmark scores for its comparison groups.
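In spirit, a benchmark score of this kind compares an institution's own average against the aggregate of its comparison group. The following sketch uses invented scores and names; it is not SciWise's actual aggregation logic.

```python
# Hypothetical per-institution scores within one comparison group.
sector_scores = {"A": [4.2, 4.4], "B": [3.8, 4.0], "C": [4.6, 4.8]}

def benchmark(institution, scores):
    """Return (own mean, comparison-group mean, gap) for one institution."""
    own_vals = scores[institution]
    own = sum(own_vals) / len(own_vals)
    # Aggregate everyone else in the comparison group.
    others = [s for name, vals in scores.items() if name != institution for s in vals]
    sector = sum(others) / len(others)
    return own, sector, own - sector

own, sector, gap = benchmark("B", sector_scores)
print(f"own={own:.1f} sector={sector:.1f} gap={gap:+.1f}")
```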
To enable science institution stakeholders to target improvements, information provided by visitors will automatically be presented to institutional stakeholders in a dashboard.
Participating institutions will gain a more accurate insight into visitor experience through automated impact evaluation with built-in benchmarking.
This initiative is guided by a panel of world-leading expert advisors. Advisors help to orient the initiative and ensure its impact is maximized for participating science practitioners and their public audiences.
The initiative’s research team facilitates the research and evaluation process using the SciWise survey tools, supporting science engagement institutions each step of the way from initial interest to real-time benchmarked analytics.
Three different surveys are used for the data collection:
The enrollment survey is designed to gather a few details quickly from visitors, including name, e-mail and visitor type. This is also an opportunity for asking impact items pertaining to a key concept, such as ‘science’. Options to enroll visitors include:
| Entry Point Data Collection | |
| --- | --- |
| Option 1 | Visitors are invited to participate upon entering the facilities, either at a ticket desk or just past the entry gate. |
| Option 2 | Systematic sampling should be used, for example selecting every other visiting group or every 5th visitor. |
| Option 3 | Data can be entered by either visitor or staff, using an iPad or Android tablet. A wide range of low-cost devices can be used. |
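The systematic-sampling rule in Option 2 amounts to a fixed-interval selection from the arrival order. A minimal sketch (the interval and starting point are choices for each institution):

```python
def systematic_sample(visitors, interval=5, start=0):
    """Select every `interval`-th visiting group from an arrival-ordered list."""
    return visitors[start::interval]

groups = list(range(1, 21))          # 20 visiting groups in arrival order
print(systematic_sample(groups, 5))  # -> [1, 6, 11, 16]
```

Starting from a random offset within the first interval (rather than always the first arrival) keeps the sample unbiased.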
| Booking/Ticketing System Integration | |
| --- | --- |
| Option 1 | An integration (“hook point”) can be set up between the enrolment survey and your institution’s booking system to gather the visitor's name and email. |
| Option 2 | Other information can be gathered at your institution’s discretion (e.g. ticket type or purchase amount). |
| Option 3 | This add-on can run in parallel with other collection methods, such as face-to-face enrolment (may incur additional setup). |
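The "hook point" in Option 1 essentially maps a booking record onto the few fields the enrolment survey needs. The field names below are assumptions for illustration, not any real booking-system API.

```python
def enrolment_record(booking: dict) -> dict:
    """Extract the enrolment fields (name, email) from a hypothetical
    booking-system payload, plus any optional extras the institution allows."""
    record = {"name": booking["name"], "email": booking["email"]}
    # Optional extras at the institution's discretion, e.g. ticket type or amount.
    for optional in ("ticket_type", "purchase_amount"):
        if optional in booking:
            record[optional] = booking[optional]
    return record

print(enrolment_record({"name": "A. Visitor", "email": "a@example.org",
                        "ticket_type": "Family"}))
```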
Your visitors can access the main survey after their visit to your institution. Invitation e-mails for the main survey are batched at the end of the day (EoD) and sent automatically by the survey system, inviting visitors to participate. Visitors can respond on their preferred electronic device, such as a personal computer, tablet or smartphone.
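The end-of-day batching could look roughly like the following: one invitation message is built per visitor enrolled that day, and an EoD job hands the batch to a mail relay. The URL, wording and field names are placeholders, not SciWise's actual system.

```python
from email.message import EmailMessage

SURVEY_URL = "https://example.org/survey"  # placeholder, not a real SciWise URL

def build_invitations(todays_visitors):
    """Batch one survey invitation per visitor enrolled today."""
    messages = []
    for visitor in todays_visitors:
        msg = EmailMessage()
        msg["To"] = visitor["email"]
        msg["Subject"] = "Tell us about your visit today"
        msg.set_content(
            f"Hi {visitor['name']}, please complete our short survey: {SURVEY_URL}"
        )
        messages.append(msg)
    return messages

batch = build_invitations([{"name": "A. Visitor", "email": "a@example.org"}])
print(len(batch), batch[0]["To"])
```

Actually dispatching the batch would use an SMTP relay (e.g. `smtplib`) scheduled at close of business.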
We enable longitudinal impact evaluation with the survey system, whereby a third and final survey invitation is sent to visitors up to 9 months after their visit (this timeframe should be discussed).
This survey asks visitors to answer parallel impact questions and, in accordance with your evaluation requirements, focuses on the actual impacts of the visit, such as pro-conservation actions visitors have taken afterwards (e.g. visiting other zoos, joining wildlife protection organisations, or reading up on conservation issues).
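Scheduling the follow-up invitations described above reduces to computing send dates relative to the visit. A minimal sketch; months are approximated as 30-day blocks here (a production scheduler would use calendar months, and the timeframe is agreed with each institution):

```python
from datetime import date, timedelta

def follow_up_dates(visit_date, months=(3, 6, 9)):
    """Approximate follow-up survey send dates after a visit,
    treating each month as a 30-day block for simplicity."""
    return [visit_date + timedelta(days=30 * m) for m in months]

print(follow_up_dates(date(2024, 1, 15)))
```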
The cornerstone of SciWise research is its benchmarking capability, made possible through expert-designed questions and survey methodology. The Question Bank used for this project is available to participating science organisations at sign-up.
Support is provided for developing the methods and survey design to best meet your institution’s research and evaluation needs. Project advisors help to provide practical expertise and the SciWise delivery team implements the survey and technical systems.
Your institution's stakeholders should collaborate on the survey design and provide measurement objectives to help guide data collection in areas of interest.
The survey design can address a range of interests, including visitor experiences, service quality, and educational impacts. If your institution's stakeholders have identified impact evaluation as their central concern, items may be exchanged to emphasize emotional, attitudinal, learning or behavioral impacts pertaining to science education.
Survey elements focusing on marketing and service aspects (e.g. food and facilities) are included in this survey design to the extent that they are relevant to potential educational impacts.
The audience intelligence generated can be used to inform internal improvement efforts at your institution and to support claims of educational impact in response to public requests for evidence of impact and relevance to science education. It may also provide a means of demonstrating particular standards in relation to national science associations or licensing agreements.
All data, analytics and reporting are accessible at all times through an integrated dashboard provided to your organisation's stakeholders. Where applicable, elements are also benchmarked against other institutions participating in the SciWise Project on an aggregate level. The following methods are standardized, but can be adjusted based on specific request.
Using the calculator:
The tool above estimates participation costs for your institution. Simply select your institution type and an estimated number of visitors per year. A quote will be prepared for you.
These costs are per site of data collection. Proposed costs are valid for up to 30 days.
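The calculator's logic presumably combines the institution's tier with its annual visitor band. The sketch below shows the general shape of such a calculation; every multiplier and fee here is an invented placeholder, not SciWise's actual pricing.

```python
# Invented placeholder pricing: per-site base fee scaled by country tier
# and annual visitor band. Not SciWise's real rates.
TIER_MULTIPLIER = {1: 1.0, 2: 0.8, 3: 0.6, 4: 0.45, 5: 0.3, 6: 0.2}
BASE_FEE = 1000  # per site of data collection, illustrative units

def quote(tier, annual_visitors):
    """Estimate a per-site participation cost from tier and visitor numbers."""
    if annual_visitors < 10_000:
        size_factor = 1.0
    elif annual_visitors < 100_000:
        size_factor = 1.5
    else:
        size_factor = 2.0
    return BASE_FEE * TIER_MULTIPLIER[tier] * size_factor

print(quote(tier=2, annual_visitors=50_000))  # -> 1200.0
```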
|Below 200,000||Below 100,000||Below 10,000|
|TIER 1||TIER 2||TIER 3||TIER 4||TIER 5||TIER 6|
|Australia||Bahrain||Antigua and Barbuda||Albania||Angola||Afghanistan|
|Austria||France||Bahamas, The||Algeria||Armenia||American Samoa|
|Canada||United Kingdom||Cyprus||Barbados||Bolivia||Burkina Faso|
|Denmark||Czech Republic||Belarus||Bosnia and Herzegovina||Burundi|
|Germany||Equatorial Guinea||Bulgaria||El Salvador||Central African Republic|
|Japan||Italy||Costa Rica||Guatemala||Congo, Dem. Rep.|
|Luxembourg||Latvia||Dominican Republic||India||Côte d'Ivoire|
|Saudi Arabia||New Zealand||Iraq||Mongolia||Guinea-Bissau|
|United Arab Emirates||Russia||Mauritius||Paraguay||Kiribati|
|Trinidad and Tobago||Peru||Tonga||Madagascar|
|Saint Vincent and the Grenadines||Uzbekistan||Marshall Islands|
|South Africa||Micronesia, Fed. Sts.|
|Uruguay||Papua New Guinea|
|São Tomé and Príncipe|
|West Bank and Gaza|