The easiest way to set up the program evaluation forms and pre/post assessment is using the my.siyli.org platform, which will email the participants directly and create the reports you need. For more info, please see the help section for my.siyli.
If you are not able to use my.siyli.org (e.g. if the language you are teaching in isn't available), you can set them up yourself. In this case, SIYLI is not able to support you with the logistics or data analysis.
Program evaluations at the end of the in-person SIY program can either be handed out on paper or sent electronically the following day. The questions SIYLI uses are in this Excel sheet, which you are welcome to use: you can print from the file directly, or use it to set up an electronic survey (e.g. through Google Forms, SurveyMonkey, or another tool). Paper forms will likely get a higher response rate, but you will then need to enter the data in a format you can share with your client.
We only use the pre/post assessment for the 2-day SIY program (not shorter programs). We send out the pre-assessment one week before the program and the post-assessment after the capstone webinar (~4 weeks after the in-person program). The questions we use are here.
The analysis SIYLI performs on the pre/post data is complex. To create a simplified version, you might look at the aggregated average score change for the whole group, rather than trying to match individual participants and measure each person's pre/post score change.
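The simplified group-level approach above can be sketched in a few lines. The item names and response values below are hypothetical placeholders; in practice you would load your own survey export (e.g. a CSV from Google Forms or SurveyMonkey).

```python
def average(scores):
    """Mean of a list of numeric survey responses."""
    return sum(scores) / len(scores)

# Aggregated responses per assessment item on a 1-5 scale
# (hypothetical example data, not real SIYLI survey items).
pre_responses = {
    "I notice my emotions as they arise": [2, 3, 3, 2, 4],
    "I recover quickly from stressful events": [3, 2, 3, 3, 2],
}
post_responses = {
    "I notice my emotions as they arise": [4, 4, 3, 4, 5],
    "I recover quickly from stressful events": [4, 3, 4, 4, 3],
}

# Group-level change: compare the average score for each item
# before and after the program, without tracking individuals.
for item in pre_responses:
    change = average(post_responses[item]) - average(pre_responses[item])
    print(f"{item}: average change {change:+.2f}")
```

Because this compares only group averages, responses don't need to be linked to individual participants, which keeps both the survey setup and the analysis simple.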
You do not need to send the information to SIYLI. We don't expect to receive data that isn't coming through the my.siyli platform, simply because we don't have the bandwidth to upload and analyze it alongside the data we collect.
Is SIYLI's Pre/Post Assessment Validated?
The pre/post assessment is not a validated instrument, so we cannot claim definitively that the SIY program is the primary cause of the benefits seen in the results. That said, we have over 1,500 data points showing positive trends across all assessment items, and we often speak to the data in that way: we can identify trends and the directionality of program results. Also, a fair number of survey items were taken from other validated assessments (though we recognize that they lose validity when taken out of the original instrument). Many items have also been tested to ensure they are not leading, confusing, or double-barreled.