Teamable Analytics Prod:
The live production build of Teamable Analytics, accessible through Canvas. We refer to this as `ta-prod`.
Teamable Analytics Verf:
The staging version of Teamable Analytics, accessible through CanvasTest. We refer to this as `ta-verf`.
Note: We denote Teamable Analytics as `TA` in further documentation.
Canvas: The production or "real" Canvas.
CanvasTest: A separate instance of Canvas. It contains a dump of the real Canvas database that is updated monthly.
You can get there with this link - https://ubc.test.instructure.com/
Before the process begins, the prerequisites are:
- A course with `TA` implemented. (This needs to be done by CTLT.)
- This course needs to have students (test Canvas users).
- Ideally the course also has sections with students divided evenly among them.
- The impending threat of a demo on your back 👀.
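If you're scripting test-data setup, the "divided evenly" prerequisite is easy to satisfy with a round-robin assignment that keeps section sizes within one of each other. This is just a sketch; the student IDs below are made up:

```python
import itertools

def assign_sections(student_ids, n_sections):
    """Round-robin students into sections so sizes differ by at most one."""
    sections = [[] for _ in range(n_sections)]
    for student, section in zip(student_ids, itertools.cycle(sections)):
        section.append(student)
    return sections

# 10 hypothetical test students across 3 sections -> sizes 4, 3, 3
sections = assign_sections([f"student_{i}" for i in range(10)], 3)
print([len(s) for s in sections])
```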
After that is complete, the steps are as follows:
- Consolidate changes.
- Deploy changes and ensure stability.
- Data setup.
- Test run.
- Complete demo.
Any changes needed for the demo will need to be completed and merged to the master branch. It goes without saying that
these should be tested and validated to avoid complications during the demo.
Specifically, the current process is (a) merge to master, (b) deploy to `ta-verf`, then (c) deploy to `ta-prod`.
Currently, only CTLT members can deploy to `ta-verf`/`ta-prod`.
- Ensure `ta-verf` is stable and running.
- Ensure CanvasTest is connected to `ta-verf`.
Whenever CanvasTest refreshes data from Canvas, the connections to `TA` break. Since Canvas connects to `ta-prod`, after the refresh CanvasTest will try to connect to `ta-prod`. This isn't allowed, so usually the links from CanvasTest to `TA` will just disappear entirely. They need to be manually reconnected by a CTLT member. You can tell this is the problem if links to `TA` from courses it is implemented in do not work or are missing.
- Ensure connections are working.
Sometimes after reconnecting CanvasTest to `ta-verf`, opening the app and using features that integrate with the Canvas API (importing, pulling, pushing) will cause errors. This is another problem solved by CTLT.
Now we're ready to set up the data. There are a few types:
- Student data
- Assignment data
- Survey data
- Peer evaluation data
If you're using a course that has been used for a demo before, then the data setup might have been completed already. Be sure to remove extraneous DB records left over from previous demos.
The student data comes from Canvas/CanvasTest and is imported by clicking the "Import Course Data" button in `/<COURSE_ID>/sections`. As long as the data exists in the connected Canvas instance, this will work (CanvasTest for `ta-verf`, Canvas for `ta-prod`).
Remember that CanvasTest refreshes with the data from Canvas monthly on pre-assigned dates that CTLT will know (so we can plan in advance), but this refresh cannot be manually triggered.
Note: Student accounts cannot log in to CanvasTest at all, so peer evaluations and surveys cannot be mock-completed in CanvasTest/`ta-verf`.
In order to demonstrate the "Monitor Teams" functionality, assignment grades are needed. This means creating assignments/quizzes in CanvasTest and giving each student a grade in the gradebook for them.
The easiest way I found was to make a "Graded Quiz" with one question of type "essay question". You can set any number of total marks for an essay question, so choose a number that makes sense. Then go to the gradebook and give each student a mark. (To be extra thorough, I generated a set of normally distributed integers, one per student in the class.)
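One way to generate those normally distributed marks is a few lines of Python. The mean and spread fractions below are assumptions about what a plausible gradebook looks like, not values from `TA`:

```python
import random

def mock_grades(n_students, total_marks, mean_frac=0.75, sd_frac=0.12, seed=42):
    """Normally distributed integer marks, clipped to [0, total_marks]."""
    rng = random.Random(seed)  # fixed seed so reruns give the same gradebook
    return [
        max(0, min(total_marks, round(rng.gauss(mean_frac * total_marks,
                                                sd_frac * total_marks))))
        for _ in range(n_students)
    ]

# e.g. a 25-student class and a 20-mark essay question
print(mock_grades(n_students=25, total_marks=20))
```

Paste the resulting numbers into the gradebook by hand, or adapt the script if you have Canvas API access.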
- Create attributes.
- Import one attribute from each attribute bank.
- Perhaps make an extra one or two so you can create meaningful project requirements later.
- Create a project preference attribute based on a project set you prepared earlier.
The project set should have a few projects that make sense, with corresponding requirements that make sense. Set the number of teams that can work on each project so that the capacities add up to a desirable total number of teams for the project set.
- Create a survey from all of these attributes. The order isn't functionally important, but ideally it should make sense.
- Publish this survey and preview it on Canvas.
- Create randomized fake responses.
Someone with superuser access to the application needs to go to `/<COURSE_ID>/surveys` and click the dark, orange-outlined button next to the survey to generate random responses for that survey.
- Create a WeightAlgorithm group set using this survey data.
Any settings here are fine. Creating this based on the created project set is a good idea, but you could just set a hardcoded number of teams.
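The bookkeeping from the project-set step, per-project team caps summing to the desired total number of teams, is easy to get wrong by hand; a tiny check catches it. The project names and numbers here are made up:

```python
# Hypothetical project set: each project caps how many teams may take it.
project_capacities = {
    "Mobile App": 3,
    "Data Pipeline": 2,
    "Dashboard": 3,
}
desired_total_teams = 8

total_capacity = sum(project_capacities.values())
assert total_capacity == desired_total_teams, (
    f"caps sum to {total_capacity}, expected {desired_total_teams}"
)
print(f"OK: {total_capacity} teams across {len(project_capacities)} projects")
```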
- Create peer evaluation attributes.
Create one of each type, each written so it makes sense for the course.
- Create a peer evaluation with these attributes.
The group set selected for this doesn't have to be the group set you published.
- Start date should be current date or earlier, close date should be after the demo date.
- Set feedback to be viewable, but make one of the attributes have no feedback available.
- Preview as a few students to ensure things work as intended.
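The date constraints on the peer evaluation (open on or before the demo day, close after it) can be sanity-checked with placeholder dates:

```python
from datetime import date, timedelta

def window_open_for_demo(start, close, demo_day):
    """Start must be on/before the demo day and close must be after it."""
    return start <= demo_day < close

# Placeholder dates: demo in three days, evaluation open now, closes after.
demo_day = date.today() + timedelta(days=3)
start = date.today()
close = demo_day + timedelta(days=1)
assert window_open_for_demo(start, close, demo_day)
```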
Before starting these steps, navigate to CanvasTest with this link: https://ubc.test.instructure.com/
After logging into CanvasTest, you'll need to navigate to a course that has `TA` enabled. For us, we have a course named "tfdemo" (https://ubc.test.instructure.com/courses/31084) that we use for demo purposes. You access `ta-verf` by clicking the "Team Formation" link in the course navigation.
Run through the Example Demo Guide.
Run through the Example Demo Guide again, with any modifications you need to make.