ReVIEW Talent Feedback System provides instructional design, cybersecurity training, leadership coaching, and a platform for data-driven growth and evaluation.
Instructional Design
Teachers Love our Tools
Video Based Calibration
Increasing Evaluator Capacity
Comprehensive Talent Platform
Improving Instructional Capacity
Flexible Forms
Adapt to any Environment
Flexible Frameworks
No Evaluation Plan is the Same
Effective Reports
Feedback that Drives Growth
Program Evaluation
Strengthen the quality of your RFP response by including third-party program evaluation.
Framework Understanding
Observed evidence needs to be connected to the appropriate indicators. We have research-validated program tools that measure alignment to any framework or rubric.
Qualitative and Quantitative Feedback
Our program evaluators use a preponderance of both qualitative and quantitative evidence aligned to program goals, objectives, and indicators.
Curriculum, Instruction, and Assessment
Instructional capacity increases when teachers get actionable feedback based on best pedagogical practices. Our program evaluators help you grow through the Claim, Connect, Action method to ensure trainers focus on meaningful student engagement and learning.
Supportive Yet Critical Feedback
When Principal Investigators and program managers receive feedback for growth directly connected to your indicators, instructional capacity grows. The ReVIEW Talent Feedback System puts the focus on coaching and development, not the tagging of evidence.
An Awareness of Bias
We all have different lenses. The ReVIEW Talent Feedback System helps Principal Investigators and program managers recognize their bias and provide the most objective feedback possible.
Clear Communication
Our system trains teachers to write effective, evidence-driven feedback reports. Using the Claim, Connect, Action method, Principal Investigators and program managers learn to make claims based on a framework, connect those claims to observed evidence, and then recommend actionable feedback.
Contact Us
Want to Win More Money? Contact Us.
Instructional Design
Designing for Reflective Growth
We train our team in Pragmatic Instructional Design (McVerry, 2019). Pragmatic Instructional Design (PID) deploys a seven-step process to ensure the highest course quality.
Develop Goals and Concepts
Choose Instructional Design Methods
Identify Course Taxonomy
Create Measures of Growth
Draft Course Content
Content Validity Study
Embed Best Practices
Our instructional designers apply the latest, and more importantly the most lasting, research in cognitive science.
Learning Tools
We begin with our Five by Five method: what must the learner do in five months and still understand in five years?
Instructional Design Methods
Pragmatic Instructional Design takes a dialectical approach, adopting multiple perspectives: opposing ideas can both be true at the same time. No one method of instructional design is better than all the others.
Instead, we evaluate our newly formed goals against the context of learning and choose a primary and a secondary instructional design theory to guide our development work.
For our cybersecurity maturity model certification courses, for example, we chose Understanding by Design and Multimedia Learning Theory.
Identify Course Taxonomy
Next our instructional designers meet with your subject matter experts to create a knowledge and skills tree.
We examine the test objectives and break the body of knowledge down into what a learner needs to know and what a learner needs to do. We recommend this exercise, from Understanding by Design, to all companies where we provide training support.
Take learning to ride a bike. You need to know what a handbrake is (knowledge) and you need to apply varying pressure at different speeds (skill). Learning how brakes work requires both. Learning to ride a bike requires learning brakes.
Cognitive scientists break knowledge down into three buckets: declarative (knowing what), procedural (knowing how), and conditional (knowing when). Creating cybersecurity courses requires us to assess across all three.
Artifact Upload
Once we know the objectives we switch gears to the assessment. Most people build the class first and then the test. We flip that order: in the Understanding by Design methodology, you start with the end in mind.
Now that you know where you want to see knowledge growth in your students, you have to build a tool that elicits evidence of that growth.
This requires us to develop performance tasks aligned to our objectives and key goals. We also want to apply authentic assessment contexts, and this often requires going beyond multiple-choice or essay questions.
Design Learning Activities
In the next step of pragmatic instructional design we create a scaffold of activities that builds the skills and knowledge that we will measure on our performance assessment.
The content gets crafted by our instructional designers and our SMEs, and we follow a simple navigation structure. Predictable design leads to improved learning performance. We follow a four-phase model of explicit instruction, guided practice, independent practice, and reflection.
Content Validity Study
Once we finish the design we send the course out to an independent company to conduct a content validity study. Data Intelligence Technologies does not see the data and always makes the results public.
Our assessments also get rigorous checks of reliability.
You must demand high standards when choosing an LPP. If they cannot prove, using industry standards, that their course teaches and measures what it says it will in a consistent and reliable way, you need to walk away.
Validating tests is a science. Data Intelligence Technologies uses the best psychometricians in the world to ensure our courses are both valid and reliable.
Pilot Class with Best Practices
We then pilot our class with a group of learners and we iterate on variables that we believe will enhance our pedagogical goal.
We also train our instructors on delivering high-quality feedback. All learning derives from learner reflection and teacher feedback. At DIT we utilize the Claim Connect Action (Tepper and Flynn, 2019) method. In CCA our instructors make a claim about user performance against a course objective. They then connect this to evidence elicited from student performance and provide the learner with actionable feedback.
Video Based Calibration
Ensuring the Capacity of Your Evaluators
Step One: Collecting Evidence
Evaluators in your district receive access to a video from our library. These videos have been normed against numerous frameworks across the country.
Step Two: Analyze the Evidence
Your district evaluators then score this video against your chosen framework.
Using the user-friendly interface, they can score at the indicator and domain level and also select a rating at the attribute level.
Step Three: Coaches Provide Feedback
Then a coach from ReVIEW Talent Feedback System will score the evaluator’s submitted report. Our coaches provide your evaluators with a model of the high-quality written feedback teachers deserve.
Step Four: Create a Professional Development Plan
As a district leader you will have access to facilitator reports that give a snapshot of how well your evaluators support the instructional capacity of your teachers. You can also download all evaluator and facilitator reports and data.
Comprehensive Talent Platform
Providing Feedback to Improve Evaluations
ReVIEW Talent Feedback System provides school districts with the tools to make feedback the driver of capacity growth. We built the platform to make reflective practice feel natural, not an additional requirement in an already busy day.
Multi-Layered Feedback
At ReVIEW we believe a noble path to improvement begins with feedback. This holds true from Central Office to our youngest students. The district administrator dashboard provides easy navigation to many tools. ReVIEW Talent Feedback System is organized around schools (or any group of individuals, such as departments). The District Admin account has access to special features, including:
Framework Management: Change the rubrics used to evaluate different positions in the district or university.
Event Management: Use pre-built forms or customize your own, then schedule events such as observations, portfolios, or course completions.
School Management: Add schools, departments, and users. Access and download any form submitted by a user.
Group Management: Create groups of users, such as first-year employees, tenured, and non-tenured staff. Then you can set different frequencies for observations based on these groups.
Export Form Fields: Download any response submitted to a form. You can select by school, user, form, and even by question.
Facilitator Reports: If you want to provide feedback on observation reports or assessments written by department chairs or evaluators, you can get a snapshot of how well your team delivers high-quality feedback.
Report Management: Choose from over twenty different reports that you can access from the district level down to the individual user level.
Video Based Calibration: If you purchase access to our professional development library, you can create inter-rater reliability tests. You also get access to the videos in your form builder, allowing you to create video-based professional development.
Help: Reach out to us.
Teacher and School Leader Evaluations
In ReVIEW Talent Feedback System users are organized by their roles and positions.
ReVIEW Talent Feedback System allows you to choose between Teacher, Evaluator, Facilitator, and School Admin. Each role increases in rights, and users can hold multiple roles (see the sketch after the list below).
Teacher: Submits artifacts and can be evaluated. Does not evaluate anyone else.
Evaluator: Can evaluate a teacher.
Facilitator: Provides feedback to the evaluator.
School Admin: Can access all completed forms in that school, can change evaluator and teacher assignments.
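For readers who want a concrete picture of how these roles stack, here is a minimal sketch in Python. It is an illustration under our own assumptions, not the platform's actual code; the class names and the permission check are hypothetical.

```python
from enum import IntEnum

# Hypothetical sketch of the role hierarchy described above; the ordering
# reflects "each role increases in rights" and is for illustration only.
class Role(IntEnum):
    TEACHER = 1       # submits artifacts and is evaluated
    EVALUATOR = 2     # evaluates teachers
    FACILITATOR = 3   # gives feedback on evaluators' reports
    SCHOOL_ADMIN = 4  # sees all completed forms in the school

def can_evaluate(user_roles: set[Role]) -> bool:
    """A user may evaluate teachers if any of their roles grants that right."""
    return any(role >= Role.EVALUATOR for role in user_roles)

# Users can hold multiple roles at once.
jane = {Role.TEACHER, Role.EVALUATOR}
print(can_evaluate(jane))  # True
```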
The system also comes pre-loaded with a library of normed videos that districts can use to provide professional development activities around calibration and inter-rater reliability. Evaluators complete the video-based observation and then receive feedback from a district employee or a ReVIEW facilitator.
Group Management
In ReVIEW Talent Feedback System the frequency of events and observations depends on group assignments. District Administrators can create unlimited groups and assign forms and frequencies based on group membership.
First you make a group. You can then assign a user to the group by accessing the school management window.
You can then select events for each group.
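As a rough illustration of how group membership drives event frequency, here is a minimal sketch; the group names, event names, and counts are made-up examples, not defaults shipped with the system.

```python
# Illustrative sketch only: mapping each group to how many of each event
# type its members complete per year. All names and numbers are invented.
observation_frequency = {
    "first_year":  {"formal_observation": 3, "goal_setting": 1},
    "non_tenured": {"formal_observation": 2, "goal_setting": 1},
    "tenured":     {"formal_observation": 1, "goal_setting": 1},
}

def required_events(group: str) -> dict[str, int]:
    """Look up how many of each event a member of this group owes per year."""
    return observation_frequency.get(group, {})

print(required_events("first_year"))  # {'formal_observation': 3, 'goal_setting': 1}
```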
Flexible Forms
ReVIEW Talent Feedback System has a fully functional form builder you can use in your organization. You can make goal-setting forms, portfolios, surveys, and even online classes for your users to complete.
Pre-Populated Forms
ReVIEW Talent Feedback System comes pre-populated with over twenty forms. These include goal-setting meetings, self-reflections, formal observations, mid-year reports, and much more. You can use these forms out of the box or edit them to suit your needs. The event names, such as teacher observations, service, or organization, can be easily customized.
Making Forms
When creating a new form you get to choose its category. This is used to generate reports based on the responses captured in the database. You can also assign different forms based on the roles of the user.
Form Components
When building forms you can choose from over a dozen types of components; an example form built from several of them follows the list below. This gives you the ability to design anything from an observation form to an online class.
Date of Observation: This field is mandatory by default when added; it displays the date of observation on the teacher and evaluator dashboards once a report is completed.
Text: A single-line text box, good for small fields such as a name or school.
Text Area: Users get an expandable text box. This is good for long-form paragraph entry.
Checkbox: Create checkboxes where users can pick one or more choices.
Dropdown: Create a menu of choices where users select one. Click the plus button to add options, then type in the responses you want users to be able to select.
Scale Score: Teachers or evaluators choose a rating using your predetermined scale labels (0-4, for example).
Goal Setting: Users can add goals and action steps to reach those goals.
Rubric: The rubric question allows you to choose one domain and indicator. If your users work from different frameworks based on position, you have to select a domain and indicator for each position.
Calendar: Allows a user to select a date from a pop-up calendar. This is useful for setting meetings or noting when a task is completed. It does not get displayed on the dashboard like the Date of Observation component.
File Upload: Allows users to share artifacts such as pictures and reflections. Video uploads are available for an additional charge to cover the cost of storage and bandwidth.
Overall Score: A user's overall score across all domains and indicators.
Label: A read-only component that can include basic HTML. A great tool to help with navigation or when building online classes for professional development.
Domain Score: The overall score on any given domain. If you have different frameworks for different positions, you will have to select each domain.
Table: Build a table for your users to fill out. You select the number of columns and rows and then provide labels.
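To make the component list concrete, here is a hypothetical sketch of how a formal observation form might be assembled from these components. The structure, field names, and options are illustrative assumptions, not the form builder's internal format.

```python
# Hypothetical sketch of an observation form built from the component types
# listed above. Field names, options, and structure are illustrative only.
formal_observation = {
    "name": "Formal Observation",
    "category": "teacher_observation",
    "components": [
        {"type": "date_of_observation", "label": "Date of Observation", "required": True},
        {"type": "text", "label": "Teacher Name"},
        {"type": "dropdown", "label": "Grade Level", "options": ["K-5", "6-8", "9-12"]},
        {"type": "rubric", "label": "Domain 2: Classroom Environment"},
        {"type": "scale_score", "label": "Overall Rating", "scale": [0, 1, 2, 3, 4]},
        {"type": "text_area", "label": "Evidence and Feedback"},
        {"type": "file_upload", "label": "Artifacts"},
    ],
}

print(f"{formal_observation['name']}: {len(formal_observation['components'])} components")
```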
Flexible Frameworks
Build Rubrics For Learning
At ReVIEW Talent Feedback System we know no two organizations have the same evaluation plan. So we give you maximum flexibility when developing or using a framework of excellence with your team.
Choose your Scales
You first choose the scale you will use as an organization. Our scales, ranging from 0 to 5, allow you to create anything from holistic rubrics, where you make an overall judgment, to analytical rubrics, where you score users on domains, indicators, and attributes.
Flexible Frameworks
When editing a rubric you name your highest-level domain and assign it a numerical value.
When editing a rubric you also have the option of adding additional indicators and attributes.
The rubric creator built into ReVIEW Talent Feedback System even allows you to customize the descriptor language at the attribute level. This allows you to build in key levers that signify growth in your users.
You can then assign a different framework to each position in your organization. When you first deploy ReVIEW Talent Feedback System you determine position names. Once you finish customizing rubrics you can assign different frameworks to your users.
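As an illustration of the domain, indicator, and attribute structure described above, here is a minimal sketch of what a customized rubric might look like as data. The framework name, descriptor text, and scoring values are invented for the example and are not the platform's internal representation.

```python
# Illustrative sketch: domains contain indicators, indicators contain
# attributes, and each attribute carries customizable descriptor language
# keyed to the chosen scale. All names and descriptors are made up.
rubric = {
    "framework": "Sample Teaching Framework",
    "scale": [0, 1, 2, 3, 4],
    "domains": [
        {
            "name": "Instruction",
            "indicators": [
                {
                    "name": "Student Engagement",
                    "attributes": [
                        {
                            "name": "Questioning",
                            "descriptors": {
                                0: "Questions are absent or off-task.",
                                2: "Questions check recall of facts.",
                                4: "Questions push students to justify their thinking.",
                            },
                        }
                    ],
                }
            ],
        }
    ],
}
```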
Effective Reports
Focus on Feedback
ReVIEW Talent Feedback System also has a robust report generation tool. Building and district leadership can use the data to drive systemic change.
The report management dashboard comes with over twenty report options. You have the ability to compare teacher, school, and district averages. You can track the total number of observations and forms completed.
Once the report options are selected, visually appealing graphs and tables display the data you need.
ReVIEW Talent Feedback System also allows you to download the responses to any form as a CSV spreadsheet. You can select responses by school, form, and even question. These responses are great to use for staff training.
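As a rough sketch of what you can do with an exported CSV, the snippet below computes school and district averages from the downloaded responses. The file name and column names are assumptions for the example; adjust them to match your actual export.

```python
import csv
from collections import defaultdict

# Minimal sketch, assuming the exported CSV has "school" and "score" columns.
# The file name and column names are assumptions; match them to your export.
school_scores: dict[str, list[float]] = defaultdict(list)

with open("form_responses.csv", newline="") as f:
    for row in csv.DictReader(f):
        school_scores[row["school"]].append(float(row["score"]))

# Compare each school's average with the district-wide average.
all_scores = [s for scores in school_scores.values() for s in scores]
district_avg = sum(all_scores) / len(all_scores)
for school, scores in sorted(school_scores.items()):
    print(f"{school}: {sum(scores)/len(scores):.2f} (district {district_avg:.2f})")
```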
Facilitator reports are also available for organizations using the ReVision Learning Supervisory Continuum. These provide a snapshot of the quality of your team's feedback and can be used to track progress as you focus on delivering feedback for growth.