Ethical AI for Instructors

An article in the New York Times caught my eye yesterday: “The Professors Are Using ChatGPT, and Some Students Aren’t Happy About It.” It caught my eye because I have been thinking a lot about AI in education; for the past six weeks, I have been coding AI integrations into Innovation. From the perspective of a teacher, it drives me crazy when my students submit ChatGPT-generated work and pass it off as their own. The hoops I have to jump through as a remote instructor to prevent this are pretty byzantine!

But I am also interested as a businessman. I aim to enliven Innovation (and raise its profile) by integrating OpenAI into every aspect of the site. During this feverish coding period since mid-April, when we got our API key, I have coded apps that…

  • generate multiple-choice questions for tests, reading comprehension, videos, and Jeopardy-style games (a rough sketch of this kind of call appears after the list);
  • score short answer responses based on guidelines and model answers;
  • score longer essays based on rubrics and instructor-designed guidelines;
  • interact with students in online forum discussions;
  • generate composition topics and dictée practices for world language teachers;
  • generate custom grammar exercises for world language instructors.
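
To make the first of these concrete, here is a minimal sketch of the kind of call such an app might make, assuming the standard openai Python client. The model name, prompt wording, and output format are my illustrative choices, not necessarily what runs on the site.

```python
# A minimal sketch (not Innovation's actual code) of generating multiple-choice
# questions with the OpenAI Python client.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def generate_mcq(passage: str, n_questions: int = 5) -> str:
    """Ask the model for multiple-choice questions about a reading passage."""
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system",
             "content": "You write clear multiple-choice questions for high school students. "
                        "Return JSON: a list of objects with 'question', 'choices', and 'answer'."},
            {"role": "user",
             "content": f"Write {n_questions} questions about this passage:\n\n{passage}"},
        ],
        temperature=0.3,
    )
    return response.choices[0].message.content  # the instructor reviews this before use

if __name__ == "__main__":
    print(generate_mcq("The Erie Canal opened in 1825, linking the Great Lakes to the Hudson River."))
```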

Through the summer, I plan to add some sophisticated AI analysis options for student essays as well as rubric generators and a monitored chat.

The New York Times article describes Ella Stapleton, a senior at Northeastern University, whose professor had used ChatGPT to generate lecture notes and failed to remove the telltale signs of their origin. Another student found that the comments a professor left on one of her assignments included the professor’s chat with an AI used to help grade it. One student is suing her university, saying she was paying for instruction from the prof and not from an AI. Are they right to be annoyed?

Readers are no doubt familiar with the Talmud, a central work of Jewish thought composed of rabbinic debates spanning centuries. These debates often wrestle with how to interpret and apply biblical law to real or hypothetical situations. A hallmark of Talmudic reasoning is the use of analogy: to what extent does a current case resemble one already discussed and resolved?

This is the approach I would like to take in reasoning through specific ethical considerations regarding the use of AI by instructors. I began teaching in 1991. If we assume ethical principles to be fairly static (right and wrong should not change much from one decade to the next), then what was right then is still right now.

In 1991, a public school teacher would have access to a commercially published textbook. This would typically come with a package of pre-made tests and answer keys, workbooks for subject-specific practice, maybe filmstrips or posters, and so forth. It was the common understanding that teachers were not expected to write their own textbooks or even design every one of their own lesson activities.

In 1991, a college professor would typically teach using a commercially published textbook selected for the course. Along with the textbook came instructor guides, test banks, lecture slides, and other supplemental materials provided by the publisher. Professors might adapt these resources, but it was generally understood that they were not expected to create every reading, assignment, or exam from scratch. The role of the professor centered more on guiding discussion, delivering lectures, and evaluating student work than on developing entirely original curricula for each course.

With regard to assessment, my grammar school teachers in the 1970s sometimes used a Scantron machine to score fill-in-the-bubble tests. They did not score all of the tests by hand. My elementary classes were 35 to 40 kids to a class in a parochial inner-city school.

When I was teaching social studies here in New York State just before I retired, I was called upon each June to drive far away to meet with colleagues from other districts to score the essay portions of the New York State Regents exams. Two teachers graded each paper and we discussed the merits and the score.

In 1991, assessment at the college level often meant midterms, finals, and a handful of major papers or projects. In large lecture courses, teaching assistants might handle the grading of essays, quizzes, or lab reports, following rubrics or guidelines set by the professor. While professors were ultimately responsible for student evaluation, it was common for them to delegate portions of the grading process, especially in high-enrollment classes. The expectation wasn’t that every piece of student work would receive personalized feedback from the lead instructor, but rather that grading would be efficient, consistent, and scalable.

Returning to the students who are upset with their professors for using AI to generate lecture notes or to evaluate student work, I think we can reason by analogy, as those Talmudic scholars did in times past, to ascertain what is right.

My premise, after many hours of working with AI over a year or more, is that at this particular moment in history, the best AI has to offer is a rather naive, but sometimes insightful, young assistant. My teachers reviewed the commercially published tests and checked for typos and accurate keys. My professors supervised their teaching assistants, providing them with guidelines and checking their work. My AI helpers, who at the moment are ChatGPT and Gemini, need my guidance and supervision.

Commercially published textbooks, tests, workbooks, worksheets, and the like have been acceptable and welcomed for a century. No one would have asked the one-room schoolhouse teacher to publish her own grammar books. No one would have faulted a full professor for having his assistant grade lab reports. In 1991, before the demands of differentiated instruction, the teacher was the creative director of a plan to educate, using resources that they had vetted and sometimes using assistants that they supervised. At the time, this arrangement was both normal and uncontroversial.

The introduction of AI as a source of learning or an assessment tool doesn’t diminish the instructor’s crucial role; it amplifies it, in the same way a carpenter’s work was amplified by the invention of the nail gun. Just as educators have always been responsible for the quality and integrity of their classrooms, they must now extend that vigilance to AI. This active supervision ensures that AI enhances, rather than supplants, sound pedagogical practices.

Innovation has built all of its AI integrations around a clear philosophy: the instructor remains the expert in the loop. When AI generates test questions, they must be approved by the instructor before being added to an assessment. When AI scores an essay, the instructor sets the rubric, defines the guidelines, and reviews the results before incorporating any of them into the student’s grade. When AI participates in student discussions, it does so within parameters the instructor has defined — including tone, context, and purpose — and under active supervision. When AI grades short-answer responses, it relies on model answers the instructor has already selected and endorsed.
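
In code terms, that philosophy amounts to keeping every AI-generated item in a holding state until a human acts on it. The following is only a schematic sketch of the idea; the class and field names are hypothetical, not Innovation’s actual schema.

```python
# A schematic sketch of the "expert in the loop" pattern: nothing AI-generated
# reaches students until an instructor explicitly approves it.
from dataclasses import dataclass

@dataclass
class GeneratedQuestion:
    text: str
    choices: list[str]
    answer: str
    status: str = "pending"   # stays "pending" until an instructor acts

    def approve(self) -> None:
        self.status = "approved"

    def reject(self) -> None:
        self.status = "rejected"

def publishable(questions: list[GeneratedQuestion]) -> list[GeneratedQuestion]:
    """Only instructor-approved items may be added to an assessment."""
    return [q for q in questions if q.status == "approved"]
```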

At every turn, Innovation’s workflow puts the instructor in the role of guide and gatekeeper — promoting good old-fashioned professional oversight through the design itself.

The profs who failed to properly read and edit their course materials or assessment comments deserve to be chided for sloppy editing. But the students’ expectation that instructors author all of their own course materials is born of an age when technology makes this at least theoretically possible, although not practically so. The expectation that no assessment will pass outside the hand of the instructor is likewise a new fashion, imagined in a context of hyper-alertness to AI usage. One professor mentioned in the article was criticized by a student for chatting with the AI about how to write the critique of the student’s work. But this is precisely what a professor might have done with a live assistant in days gone by! The difference is that the student of the past would have had no knowledge of the discussion.

One of my remote students this year had nothing good to say about one of her teachers. She cited as an example the fact that her teacher got her PowerPoint slideshows from ChatGPT. If those PowerPoints were of poor quality or included incorrect information, I could agree. Where this student goes wrong is in thinking that the general practice of getting learning resources from elsewhere is illegitimate or unprecedented. The wrong would be in presenting shoddy or incorrect information, not in failing to be the author of everything.

Updating our Terms of Service for the AI Integration Rollout

We are excited to be rolling out our massive AI upgrade this month! Already, subscribers will notice the little purple buttons throughout the site’s controls offering AI assistance with test question generation, grading student work, and tasks specifically geared toward teaching modern languages.

Subscribers will be invited to agree to the new terms of service when everything is up and running in June. Here is the text of that change:

AI-ENABLED FEATURES AND RESPONSIBILITIES

Innovation Assessments LLC now provides access to a variety of artificial intelligence (AI)–powered tools to enhance educational services. These may include, but are not limited to, automated test question generation, grading support for essays and short answers, rubric design assistance, writing prompt creation, and supervised chat-based discussion with AI for students. Users should be aware that AI-generated content may contain inaccuracies or reflect inherent biases, and human oversight is crucial.

Use of AI services is subject to the following terms:

  • Students may only access AI-powered chat or discussion tools under their teacher’s license, and only if the teacher has enabled this feature for the activity.
  • All AI-generated content is provided “as-is” and may require human review. Teachers are responsible for reviewing all materials prior to use in assessments or instruction.
  • Essay grading by AI is advisory in nature. Final evaluation remains at the discretion of the teacher or institution.
  • Teachers and students may not use the platform’s AI features to submit or generate content that is harmful, discriminatory, or in violation of academic integrity policies (which include, but are not limited to, plagiarism and unauthorized assistance).
  • Innovation Assessments LLC reserves the right to monitor, restrict, or disable AI usage in cases of misuse, abuse, or usage patterns that negatively impact the platform’s performance or other users.

Innovation Assessments LLC will handle data generated through AI features in accordance with its Privacy Policy. Innovation Assessments LLC may update or modify its AI-powered features and functionalities over time. By using any AI-related features, you acknowledge the limitations of current AI technology and agree not to rely solely on AI-generated outputs for high-stakes educational decisions.

AI TOKEN USAGE

Access to AI features is governed by a monthly token system. Each account tier includes a set number of AI tokens per month, which may be used for supported features such as question generation, AI chat, grading support, and other automated tools. Tokens renew every 30 days from the date of paid subscription.

  • AI tokens do not roll over. Unused tokens expire at the end of each 30-day cycle.
  • Users may purchase additional token bundles if their monthly allotment is exhausted before renewal.
  • Token usage is calculated by the AI company and may vary based on the feature and the amount of text processed in both the request and the response. Requests with more text will consume more tokens, as will more detailed or lengthy AI-generated content. Higher-cost actions (e.g., full essay scoring) consume more tokens.
  • Token balances and consumption details are available to account administrators within the platform dashboard.
  • It is the user’s responsibility to monitor token use and purchase additional tokens as needed.

Innovation Assessments LLC reserves the right to modify token costs, tier allowances, or features covered by tokens with notice. Abuse of the token system may result in service restrictions or termination.

✨ Make a Test with Innovation’s AI Question Generator

We are working feverishly to integrate OpenAI into Innovation this month! It is so exciting to see how this enhances our work! It’s like having a professional teaching assistant!

The new Test app has many enhancements over the previous app. Among the improvements (besides AI support) are:

  • It can have a mix of short answer and multiple-choice questions.
  • Teachers can import questions from the test bank or from other tests, create brand-new questions, or use our AI question generator.
  • Teachers can edit the questions right in the new editor app, including attaching audio clips and images if needed.
  • The student test area has a new, modern layout and user-friendly design, including full security, support for international characters, and a feature to mark questions for later review.

For short answer questions, teachers can avail themselves of the AI grading assistant! While we have left the original algorithmic AI in place, using generative AI saves you the trouble of pre-training the grading assistant.
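
For readers curious how a generative grading assistant can work without pre-training, the sketch below shows one plausible shape of the request: the instructor’s model answer goes into the prompt, and the model’s suggested score comes back for the teacher to review. The prompt wording and model name are illustrative assumptions, not the app’s actual implementation.

```python
# A rough sketch of prompting a generative model to grade a short answer
# against an instructor-approved model answer. Advisory only; the teacher decides.
from openai import OpenAI

client = OpenAI()

def score_short_answer(question: str, model_answer: str, student_answer: str) -> str:
    """Return a suggested score and one-sentence rationale for instructor review."""
    prompt = (
        f"Question: {question}\n"
        f"Model answer (instructor-approved): {model_answer}\n"
        f"Student answer: {student_answer}\n"
        "Score the student answer from 0 to 4 against the model answer. "
        "Reply with the score and a one-sentence rationale."
    )
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": prompt}],
        temperature=0,
    )
    return response.choices[0].message.content
```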

Check out the new AI integrations!

✨ Introducing OpenAI – Innovation Integration!

Innovation is excited to announce that we are working diligently to integrate OpenAI artificial intelligence into all aspects of the Innovation platform! We are developing a virtual teaching assistant that will help teachers generate “just right” learning activities and assessments.

Look for the purple button throughout the site …

… or the ✨ emoji to signal where AI integrations have been installed.

AI integration has been a goal at Innovation since before the release of the generative AI models in the early 2020s. Since 2018, Innovation has sported a pretty nifty algorithmic AI that helps grade short written work submissions and proctor student online activity. Now, with the integration of OpenAI’s generative models (GPT-3.5 and GPT-4o), we can truly realize the dream of a highly productive and efficient virtual teaching assistant!

When Innovation started out under a different name some 25 years ago, it was mainly a test generator (hence that original name). Since the pandemic, it has been meeting the needs of remote instructors and in-person classrooms alike, not only in assessment but in content delivery across a variety of subjects, including world language instruction.

Innovation is a place to create. It’s a place where teacher-authors can generate “just right” learning activities and assessments for their teaching context instead of textbook company generics. But we understand that secondary school teachers are busier than ever, seeking to meet a growing diversity of needs and ever-changing objectives and curricula. With Innovation’s AI teaching assistant, teachers can redirect their creative energies to the big picture of student learning and measurement.

Check out this simple how-to, for example, which illustrates how you can use the AI teaching assistant to help you generate a test.

Check back with us as the spring unfolds! At this writing, we have already installed OpenAI teaching assistants in:

  • Test generator (multiple-choice or short answer)
  • Études (for adding questions to PDF or video)
  • World language apps:
    • composition generator
    • composition assessment
    • grammar workspace
    • translation drag and drop “Scramblation”
  • Essay grading
  • Vocabulary flashcards and assessment

Discouraging Over-use of Translators in Online World Language Classes, Part 1

AI assistance and translators such as DeepL and Google Translate are very accurate and useful tools. When I assign my French students certain tasks, I expect they will use these tools to help them, just as I would have expected in-person students thirty years ago to use a French-English dictionary for spelling and new words on certain tasks.

The problem is that the temptation to just have the AI generate the work is a strong one. It is important for remote instructors to place obstacles in the way of this practice, which not only undermines the student’s training but represents an ethical pitfall.

Imagine this scenario: an online AP Spanish class where major assessments are take-home tasks like essays and video-recorded presentations. Under the traditional paradigm for this assignment, the student is given guidelines, due dates, and a rubric with a graphic organizer; the instructor provides all of that. Then the due date comes, all the work is in, and the instructor begins to review it. What impressive vocabulary! What elegant grammar! And yet, reflecting on the spontaneous language these same students produce in live video-conference sessions, it is hard to believe that this work could have come from some of them.

All of my remote courses begin with a training film of sorts in which I explain the concept of academic integrity and ownership of one’s work submissions. I explain that it is expected that students will learn all of the new words they incorporate into their work submissions so as to maintain ownership of the task. I demonstrate using a translator properly and improperly.

A very useful strategy is one I have used since in-person days decades ago: simply ask the student the meanings of the words in their work that I suspect they do not know. In the remote learning context, this can be difficult to arrange, since there is no easy way to pull a student aside during class and conduct the interview about their work. That’s where Innovation comes in.

Sample student work that was too perfect for their demonstrated abilities.

It only takes a few minutes to select words and phrases from the student’s work submission that I suspect they do not know. I select seven to ten words or phrases and generate a short answer translation quiz using Innovation’s Quick Short Answer.

I enter a title, maybe set the category, and enter the words with English first, an equal sign, then the French.

Innovation’s app separates each word from its meaning at the equal sign, as in the sketch below. Once I have generated the quiz, I access the quiz Master app. I set the time limit to 1 minute for the 7-10 words and turn on the high security.
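
To picture the format, here is a rough, purely illustrative parser for lines of the “English = French” type; it is not Innovation’s actual code, just a way to see how the equal sign separates the prompt from the expected answer.

```python
# Illustrative sketch: split "English = French" lines into (prompt, answer) pairs.
def parse_quiz_lines(raw: str) -> list[tuple[str, str]]:
    pairs = []
    for line in raw.splitlines():
        if "=" not in line:
            continue                       # skip blank or malformed lines
        english, french = line.split("=", 1)
        pairs.append((english.strip(), french.strip()))
    return pairs

sample = """the dog = le chien
nevertheless = néanmoins"""
print(parse_quiz_lines(sample))
# [('the dog', 'le chien'), ('nevertheless', 'néanmoins')]
```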

With the high security on, the assessment will submit and lock the student out if the student leaves the window to click on something else. Only the teacher can re-admit the student to the quiz. The window resizes to full screen when the student starts the task, and if they resize it smaller, the proctor records it. The proctor also records the start time, how much time was spent on the questions, whether text was pasted, and more.

As a final step, I lock the quiz so that it opens only with certain access codes. This allows control over how many times a student can restart the task. Simply select the Task dropdown from the playlist and then select Lock. Instructors can view the access codes from the Task dropdown or can generate one key by clicking the One Key button next to the title.

Sometimes, I will ask that the local facilitator proctor the student during the quiz so that they cannot look up the words on their own device.

So now what does one do with the results? When first introducing this strategy to students, I explain that it will not affect their grade “this time” and that it is a good reminder to maintain full “ownership” of their work. I may randomly select students for this verification, or, if it’s a small class, I may include it as a portion of their grade for a task and send one to everyone.

It’s not necessary for a student to get 100%. I usually take the quiz first to test it out and to see how many I can do in 1 minute. Even if students do not get 100%, I can learn a lot from their responses. For example, one student got 44% right out of 8 items and did so by skipping around. I interpret the skipped words as ones she forgot and intended to come back to later. Another student only got 33%. I take that as a clear sign that his work submission had too many looked-up words he did not know. I let him off with a warning this time and a reminder about academic integrity and ownership.

I once had the experience of taking over a class partway through the year. No structures had been in place to discourage inappropriate use of AI. The grades were all outrageously good. Some students were rarely in attendance and handed in only the work that would be graded. They did this work using AI, so it took no real effort. This really is a terribly corrupt system, especially given that there are students in nearby schools taking in-person classes who have to really do the work for their marks. And there are honest students with good attendance whose grades are lower for their honesty. It was an AP-level course. Now, you might argue that the students could not possibly be ready for the AP exam if they took the course this way; one would think that would deter them from cheating. But upon reflection, it’s clear that having a 98 in an AP class on one’s transcript, even if one scores only a 2 on the exam, could be valuable for college admissions. So, no, it does not deter them.

Remote learning has enormous potential, and I have great confidence in it. But we instructors need to learn how to maintain the same standards we held during in-person sessions. We cannot allow remote classes to become situations in which students are mere pass-through vehicles for AI translators that do all their work; that would amount to a sort of scam. In part 2 of this topic, I will present a strategy for teaching composition in this new world of AI-assisted homework.

Interactive Activities at Innovation

Those of us who are teaching remotely are starved for interactive apps that let us engage our students beyond screen sharing! Innovation is constantly adding apps and modifications to meet those needs.

Live Sessions

“Live sessions” are interactive sessions that students “join” through the Innovation platform.

Multiple-choice, short answer, and media activity types can all be transformed into live sessions! Just select Live Session from the Create dropdown by your activity in the course playlist. Click Live Link and copy the URL. Send it to students in, for example, the Zoom or Teams chat.

Once they join, the teacher host can present one question at a time and await student responses.

Once students respond, the teacher is notified and can debrief by displaying responses anonymously.

During a media live session, the teacher presents a slideshow and periodically poses a question, opens the system for responses, and awaits replies.

Activity Monitoring

During composition writing, grammar activities, short answer, and Étude tasks, the teacher can activate the Monitor app, found in the Task dropdown for the activity in the playlist. As students work on the task, the teacher can view their progress with minimal time delay. Read more here.

Teachers can hide the student names and the correct answers so they can share the screen with students as they work.

Asynchronous Forums and Synchronous Chat

Read more about moderated synchronous chat here and here

Read more about asynchronous chat here

Give Students Quick Access

Innovation prides itself on the flexibility to plug into any learning management system and to be easily integrated into video-conference remote lessons.

From the course playlist, you can send students a link to an activity by clicking the link icon on the right. Paste the link into the video-conferencing chat window.

Send students a link to the assessment debriefing (the student’s assessment and correct answers to the task) using the icon below that.

From within the Live Sessions, the same functionality exists.

Teaching Remotely Can Be A Chess Game …

Luckily, there’s Innovation!

Pawn to queen 3… Knight to bishop 3… Ugh!

I don’t know which metaphor best describes testing online: “arms race” or “chess match.” The frog in the slowly boiling water is another metaphor for this, but I’ll get to that later.

My young friends across the nation in my remote classes are digital natives. They know the schtick. They know that most applications people use to teach were designed for office workers, and that the kind of monitoring and controls we expect when teaching high school are just not included, nor would they generally be welcomed by the cubicle crowd.

In one school, I am to use Canvas. A student has an essay due at a certain time. They compose in Google Docs and paste a share link into the Canvas assignment submission tool. This way, they have something in on time and won’t be penalized if they are still working on it after the deadline, because it’s a Google Doc still in their custody. Wow. That’s clever.

In one school, I am using Innovation plugged into Canvas. When the security triggered and locked a student out of a test, he complained that he “just had a question for his teacher” and was guilty of nothing more than clicking on Zoom to ask. 🙄

I have inherited one AP French class from another instructor who did not maintain the kind of guardrails that I would have. Students were basically able to paste AI- or translator-generated responses into assignments. They had wonderful grades! All above 98! When I met them and asked for improvised conversation in French, they mixed up the words for 16 and 60… one asked his classmate to tell him what to say.

When I first start working with a remote class, I don’t activate all the security. Let’s face it, online security is necessary, but it adds extra, often annoying steps: two-factor authentication, waiting for a confirmation email from a site to validate an email address, proving one is not a robot by clicking on all the pictures of bikes, using some third-party app to authenticate us… I could go on. This stuff is annoying and time-consuming. So to start, I don’t activate it on my remote assignments.

But then I may start seeing language proficiency well beyond the typical ability of French class students. I see compositions that always end in “En conclusion…”. I see lots of “pasted text” records in the proctor logs. Now I turn up the heat.

The metaphor I might use here is the frog in the beaker of water in the science lab, where the Bunsen burner is turned up so slowly that the frog doesn’t notice. 😏

Using Innovation’s extensive student activity logging features, I can email the student to tell him he needs to use his second chance on the test because the proctor logs recorded that he left the page 12 times and pasted text 3 times. 🙄

Using Innovation’s locking app, I can restrict access to the test after the due date to single-use codes that students must request in order to get in and use their second-try privilege. Starting next month, the locking app will also allow me to restrict access to single-use codes from the start, so that I can micro-manage student access even more closely.
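
The single-use codes behave like one-time tickets: each admits a student exactly once and then stops working. Here is a hypothetical sketch of that mechanism in Python; the real locking app is only described in prose above, so the names and details here are my own.

```python
# Hypothetical sketch of single-use access codes: issue one code per attempt,
# and a code stops working the moment it is redeemed.
import secrets

class AccessCodes:
    def __init__(self) -> None:
        self._unused: set[str] = set()

    def issue(self) -> str:
        """Generate one code for one student, one attempt."""
        code = secrets.token_urlsafe(6)
        self._unused.add(code)
        return code

    def redeem(self, code: str) -> bool:
        """A code admits the student once, then is spent."""
        if code in self._unused:
            self._unused.remove(code)
            return True
        return False
```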

I can use Innovation’s high security setting to have the short answer test app close up and lock if the student leaves the task during the testing period (for example, to view another window). The student will have to contact me to be re-admitted.

I can follow exactly what a student is doing in the proctor’s notes for short answer and composition tasks. Did they paste in text? Delete their answer completely? Leave the page?

Anecdote: I had a student back in my last year of teaching full-time before I retired, in my 9th grade Global Studies class. He wanted to take a test again because, he said, the app deleted all his work! I re-coded the app to record when a student deletes all their work. Next time, I caught him. He was simply deleting it all and claiming a software error had failed to save his work.

Do you know what one of the biggest challenges of teaching adolescents is? It is learning not to take these antics personally. Like most adults, adolescents find it in their interest to get as much as they can out of life with the least investment of energy. Sadly, this often leads to strategies that are self-defeating in the long term and that violate ethical norms. I have gotten there: I still like my kids even though they can engage in what my grandmother would have called “diveltry”.

Pawn to queen 3… Knight to bishop 3… I have been teaching since my current students’ parents were in elementary school. I have all the tools I need to answer move for move as my digitally native young friends try to take shortcuts. Innovation helps me do that!

Checkmate! 😉

Activity Store Terms of Purchase

Effective Date: 7 January 2025

Registration

Account Registration: To make purchases, you must register for an account on the Innovation platform. Your account information will be used to manage your purchases and provide access to downloadable content.

Purchasing and Downloading

Copy to Question Banks: Upon completing your purchase, a copy of the purchased materials will be added to your personal test question bank within the Innovation platform. You will have full control over your copy, including the ability to edit or delete it.
Access in Dashboard: Purchased materials will also appear in the “Imports and Purchases” section of your account dashboard for future reference.

Ownership and Usage Rights

Ownership of Copies: Once added to your test question bank, the materials are yours to use, edit, and manage as needed for personal or educational purposes. However, the original content and intellectual property rights remain the exclusive property of Innovation Assessments LLC.
Restrictions: Redistribution, resale, or sharing of the original or modified materials outside the Innovation platform is strictly prohibited.

Content Updates

Updates and Changes: Updates to the original content may be released by Innovation Assessments LLC. However, these updates will not automatically apply to copies already in your question bank. You will retain full control over your personalized copies.

Limited-Time Download Availability

Temporary Download Links: If applicable, download links for materials will remain active for [e.g., 7 days] after purchase. Be sure to save your materials promptly. Extensions may be granted at the discretion of Innovation administrators.

Refund Policy

No Refunds: All sales are final. Due to the nature of digital products, refunds or exchanges are not offered once a purchase is completed and the content has been added to your account.

Platform-Specific Use

Exclusive Platform Use: Purchased materials are designed exclusively for use within the Innovation platform. Compatibility with third-party platforms is not guaranteed.

Technical and Account Responsibilities

Account Security: You are responsible for maintaining the confidentiality of your account credentials. Innovation Assessments LLC is not liable for unauthorized account access.
Technical Requirements: Ensure your system meets the requirements for accessing and managing content on the Innovation platform.

Disclaimers

Content Quality: While we strive for high-quality, accurate materials, we do not guarantee they will meet all individual user expectations.
Modifications: Users are encouraged to customize their copies of the materials, but Innovation Assessments LLC is not responsible for the quality or functionality of modified content.

Embedded Content from Third Parties

Third-Party Content: Some learning tasks may include embedded content, such as videos, provided by third-party platforms (e.g., video streaming services). Innovation does not control or guarantee access to these resources. What you are purchasing is the associated questions, learning tasks, and the tools provided by Innovation to integrate with and utilize such third-party content. Continued access to third-party content is subject to the terms and availability of the third-party provider.

Changes to Terms

Modification of Terms: We reserve the right to update these terms and conditions at any time. Changes will be effective immediately upon posting.

By completing a purchase, you agree to these terms and conditions. If you have questions or need assistance, contact our support team.

Terms and Conditions for PDF Purchases

4 January 2025

Registration

Account Registration: To make purchases, you must register for an account on our platform. Your account information will be used to manage your purchases and provide access to downloadable content.

Purchasing and Downloading

Download Link: After completing your purchase, a download link for the purchased PDF file(s) will be displayed on your screen. The link will also be saved in the “Imports and Purchases” section of your account dashboard for future access.

Limited-Time Availability: Download links will remain active for [e.g., 7 days] after the purchase date. Please download your files within this period. Extensions may be granted at the discretion of the store administrators upon request.

Content Updates

Updates and Changes: We reserve the right to update, modify, or remove PDF content at any time without prior notice. Updated versions of a purchased PDF may not be automatically available unless explicitly stated.

Usage and Restrictions

Intended Use: Purchased materials are intended for personal or educational use only. Redistribution, resale, or unauthorized sharing of downloaded files is strictly prohibited.

No Refunds: Due to the nature of digital products, refunds or exchanges are not provided once a purchase has been completed and the download link displayed.

Responsibility

Technical Issues: We are not responsible for technical issues, such as incompatible software or hardware, that prevent you from using downloaded files. Ensure your system has the necessary tools to open and view PDF files.

Account Security: You are responsible for maintaining the confidentiality of your account and password and for all activities under your account.

Disclaimers

Accuracy and Quality: While we strive to ensure the accuracy and quality of our PDF materials, we do not guarantee that every document will meet all user expectations.

Modifications to Terms

Changes to Terms: We reserve the right to modify these terms and conditions at any time. Changes will be effective immediately upon posting on this page.

By completing a purchase, you agree to these terms and conditions. If you have any questions or concerns, please contact our support team.

Introducing Innovation’s “DBQ Shop”

DBQ.

Document-Based Question.

If you teach high school social studies just about anywhere, working with primary source documents is likely a central feature of your course. This has been a major shift in the teaching of social studies over the past thirty years, and it is a good one!

Retired now, I taught high school social studies for 18 of my 35 years on the job. Over that time, I developed a large collection of document-based tasks and training lessons that teach students to use primary sources properly to reconstruct the likely past.

Most of my document-based tasks followed the structure of the New York State Regents Examinations for social studies: the enduring issues essay and constructed-response question (CRQ) in grades nine and ten, and the short essay and civic literacy essay in grade 11 US History. But it was also important to me to incorporate one lengthy primary source text in each unit and to have students respond in a standard essay format. I invite the reader to read more about using extended primary source tasks here.

In the DBQ shop, you will find document-based tasks that I created and edited, arranged by category. The sources are cited, which is especially important because students are expected to consider the sources carefully. If you are interested in lessons for teaching students about primary sources, you may find something you like in the PowerPoint shop!

You need to be a subscriber to make purchases from the Innovation shops. Download links for purchased items are stored in your dashboard. The resources are all in PDF format. Please review the terms and conditions.