The purpose of this article is to guide manager users through the process of creating Survey Intelligence Report Designs. SI Designs are created through a 7-step setup that offers filtering, aggregating, ordering, layout/display, and saving/publishing options. It is recommended that users have a clear goal in mind when designing reports so they can take full advantage of the feature.

  • Create or Select a Design
  • Set the filters
  • Select the surveys
  • Choose the questions
  • Set up data aggregations
  • Design the layout
  • Save the design
  • View the report, share the report, save to PDF, or export the data to Excel 


Create or Select a Design

  • Select Add New.


Set the Filters

  • What are Filters? Filters customize the design of the report.
  • Choose from the various databases (if applicable), survey attributes (period, type, focus), course attributes (number, type, year, department), people attributes (username, last name, roles), and rotation attributes (blocks and sites/groups).
  • Free-text search features help to narrow down results:
    • A minimum of 2 characters is required to search
    • The Add button allows you to look for more than one item at the same time!
    • All matches are case insensitive and use partial matching. For example, entering "bio" for the course number will find BIO 101, 101 Biol, etc.
  • Filters affect both the Survey Administrations available for selection and the data rows that appear on the report.
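The free-text search behavior described above (case-insensitive partial matching with a 2-character minimum) can be sketched as follows. This is an illustrative example, not the product's actual code:

```python
def matches(query: str, candidates: list[str]) -> list[str]:
    """Return candidates containing the query, ignoring case.
    Searches shorter than 2 characters return nothing (illustrative rule)."""
    if len(query) < 2:
        return []
    q = query.lower()
    return [c for c in candidates if q in c.lower()]

# Entering "bio" for the course number finds both of these:
courses = ["BIO 101", "101 Biol", "CHEM 200"]
print(matches("bio", courses))  # ['BIO 101', '101 Biol']
```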


Select the Surveys

  • The survey selection page displays all survey administrations that fit the filter criteria from the previous step, organized by type, period, and status.


Choose the Questions

  • Allows users to select specific questions. Course- and people-based questions will show separately. It is important to choose questions thoughtfully for aggregation and comparisons. Course Evaluations will combine questions that look like duplicates (same text and responses). Hover over the survey count to see the Survey Administrations where the question appears to help you choose.

Note: Hovering over the scale reveals the response set with its assigned values. The survey count shows the number of surveys the question is attached to, and hovering over the number reveals which surveys they are. If more than 10 surveys are associated with a question, the hover box will show the year and period instead of the survey titles.


Set Up Data Aggregations

  • Contains the aggregation and ordering options in a drag-and-drop interface. Report data will be aggregated according to the choices moved to the right, in the order they appear from top to bottom.
    NOTE - If users don't have results access to people-based questions, the Evaluated Individual aggregation may not work for them, as people-based questions may not qualify.
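The top-to-bottom ordering of aggregations works like nested group-by keys: the first choice is the outermost grouping. A minimal sketch, using hypothetical sample data rather than the product's actual logic:

```python
from collections import defaultdict

# Hypothetical sample rows; in the product these come from survey responses.
rows = [
    {"department": "BIO", "course": "BIO 101", "score": 4.2},
    {"department": "BIO", "course": "BIO 102", "score": 3.8},
    {"department": "CHEM", "course": "CHEM 200", "score": 4.5},
]

def aggregate(rows, keys):
    """Group rows by the aggregation keys, in order, and average the scores."""
    groups = defaultdict(list)
    for row in rows:
        groups[tuple(row[k] for k in keys)].append(row["score"])
    return {key: sum(vals) / len(vals) for key, vals in groups.items()}

# Department first, then course: department is the outermost grouping.
print(aggregate(rows, ["department", "course"]))
# Department alone: courses within each department are averaged together.
print(aggregate(rows, ["department"]))
```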

Report Options 

  • Allows users to set reporting options including title, layout, graphing, score scales, question grouping, response counts, advanced stats, heat map, and removed questions (selected questions for which no responses were found in the data population).
    • Freeze Headers: Headers can be pinned so they remain visible when scrolling through large data sets.
    • Leveled Scores: The 100-point scale for the overall leveled score is especially useful when mixed Likert scales (e.g., 4-point and 5-point) need to be combined into a single score, much like a report card!
    • Frequency Distribution: Hover over the mean or group median to see the frequency distribution.
    • Heat Map: This option shows scoring patterns using color. When selected, the Mean and Group Median columns are colorized based on how close each value is to the column mean.

This feature works best when only the Mean or Group Median column is included, since the colors for each question appear next to each other and the patterns are easier to see.

Sorting the column can provide visual insight into particular questions. Several colorization options are available, including an option for color-blind users.

NOTE: Precise values are used for comparisons (exactly as on the Evaluation Report), so what appears to be the same value may receive a different color in the column. This is by design. See the example below:
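To illustrate why identical-looking values can be colored differently, here is a minimal sketch (hypothetical code, not the product's implementation) that classifies each cell by its precise, unrounded distance from the column mean:

```python
def shade(value: float, column: list[float]) -> str:
    """Classify a cell relative to the column mean using the precise value."""
    mean = sum(column) / len(column)
    if value > mean:
        return "above"
    if value < mean:
        return "below"
    return "at-mean"

# Both values display as 4.65 at two decimals, yet they fall on opposite
# sides of the column mean, so they receive different colors.
col = [4.649, 4.651]
print(shade(4.649, col), shade(4.651, col))  # below above
```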


  • Decimal precision: If you are only seeing one decimal, consider changing your reporting strategy to 2 decimals. Go to the Survey menu, then Survey Site Setup, and select 2 decimals for Mean and Group Median. This will affect Evaluation and SI reports.
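The 100-point leveled score described under Report Options can be sketched as a min-max rescaling of each Likert mean onto a common 0-100 range. The formula below is an assumption for illustration; the product's exact calculation may differ:

```python
def leveled_score(mean: float, scale_points: int) -> float:
    """Rescale a Likert mean (1..scale_points) to a 0-100 score.
    Assumed formula for illustration; the product's math may differ."""
    return (mean - 1) / (scale_points - 1) * 100

# Means from a 4-point and a 5-point scale become directly comparable.
print(round(leveled_score(3.0, 4), 1), round(leveled_score(3.5, 5), 1))  # 66.7 62.5
```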

Run Report

  • Helps users set the report design title and choose among various publishing options.
  • Users may also view the design immediately, export it to a .csv file, and, if publishing publicly, set an expiration date (super users only!)

NOTE: Only one public and one Course Evaluations Response Portal report may exist at a time. If an expiration date is reached, the user may extend it or create a new report to display.

Save Settings

  • Users may keep a design private, share with other CE Managers, or share with Course Evaluations Response Portal users (faculty, administrators, and/or students if applicable).

NOTE: Results Access rights apply to designs shared with Course Evaluations Response Portal users. While the basic filters chosen for the design will be available, only data an individual has access to based on their role (faculty or administrator) will display in the report.

Sharing Designs

  • Once a design is created, Course Evaluations Super Users may share it (Step 7) or keep it private. Only the creator of a design can change it. Only the creator or a super user can delete a design.

Video - Survey Intelligence - Managers


Related Articles

SI Report - Faculty Users

Publishing Results

Survey Intelligence Report - Main Features