🎧 Recording for Upload (Using Chromebook or Chrome Browser)

If you can’t use the direct browser recording feature, you can record your conversation using your Chromebook’s built-in tools (like Voice Recorder or Memos) or a free online recording tool, and then upload the file.


Part 1: Record Your Audio File

  1. Open Your Recording Tool:
    • Chromebook App: Open the Voice Recorder app or Memos app on your Chromebook.
    • Online Tool: Open a new browser tab and navigate to a simple, free online voice recorder (e.g., searching “online voice recorder” will give several options).
  2. Start Recording: When you are ready, start recording and give your responses to the prompts in the conversation task.
  3. Stop and Save: Stop the recording once complete. Look for a Save, Download, or Export button.
  4. Name the File: If the tool prompts you, give your file a clear name (e.g., MyConvoTask.mp3). This makes it much easier to find later!

Part 2: Locating the Saved File (The Hard Part!)

Once you click Save or Download, the file goes to your local storage, most commonly the Downloads folder.

1. Using the Chrome Downloads Bar

  • After the file saves, you should see a small bar at the bottom of your Chrome browser window showing the file name.
  • Click the small arrow (⯆) next to the file name.
  • Select “Show in folder” from the menu. This will open the file explorer right to the location of your file.

2. Using the Files App

If the downloads bar disappears, you can find the file using the Chromebook’s file manager:

  1. Click the Launcher (the circle icon in the bottom-left corner).
  2. Type “Files” and open the Files app.
  3. In the left sidebar, click on “My files” or “Downloads”.
  4. Look for the file name you gave it (e.g., MyConvoTask.mp3) or look for a file saved right around the time you finished recording.

3. Uploading to the Task

Once you’ve located the file in your Downloads folder, you can go back to the Upload Version of the Conversation Task and use the Choose File button to select and submit it.


🎙️ Troubleshooting: Fixing Microphone Permissions in Chrome (Chromebooks)

If you’re having trouble recording your conversation task, Chrome may have blocked your microphone. Follow these simple steps to fix your site permissions.


Step 1: Access Site Permissions

  1. Click the lock icon (🔒) located in the address bar, immediately to the left of the site URL.
  2. A small menu will appear with basic site permissions listed.

Step 2: Check and Change Microphone Setting

  1. Look for the “Microphone” listing in the small permissions menu.
  2. If it currently says “Block” ⛔, click on it and change the setting to “Allow” ✅.

Step 3: Use Detailed Site Settings (If Microphone Isn’t Listed)

If you do not see “Microphone” listed in the initial small menu:

  1. Click the “Site settings” link at the very bottom of that menu.
  2. This will open a new tab (chrome://settings/content/siteDetails?site=...).
  3. Scroll down to the Permissions section, find Microphone, and select Allow from the dropdown menu.

Step 4: Reload the Test Page

  1. Close the settings tab you just opened (if you used Step 3).
  2. Reload (refresh) your conversation test page.
  3. Chrome should usually prompt you again: “Allow innovationassessments.com to use your microphone?”
  4. Click Allow ✅ to start recording.
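
For the technically curious: the prompt in Step 4 appears because the page asks the browser for microphone access. Here is a minimal sketch of that request using the standard getUserMedia API (a simplified illustration, not our exact recording code):

```typescript
// Request microphone access; Chrome shows the "Allow?" prompt when the
// site's permission is set to "Ask" (the state after following the steps above).
async function requestMicrophone(): Promise<MediaStream | null> {
  try {
    // Resolves with an audio stream once you click "Allow".
    return await navigator.mediaDevices.getUserMedia({ audio: true });
  } catch (err) {
    // If the permission is still "Block", the browser rejects with NotAllowedError
    // and never shows a prompt -- which is why Steps 1-3 matter.
    console.error("Microphone unavailable:", err);
    return null;
  }
}
```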

No More Lost Work: Introducing Automatic Versioning and Essay Recovery

We’re excited to announce a significant upgrade to our writing assessment platform: a robust, automatic versioning system. This change is designed to eliminate the anxiety and frustration caused by unexpected browser crashes, internet connection drops, or power outages.

This system ensures that if your computer fails, you can recover nearly every word you’ve written.


1. How the System Works: Two Tiers of Protection

We’ve implemented a two-tiered saving system that balances constant crash protection with efficient storage of long-term history.

Tier 1: Crash Recovery (Every 50 Seconds)

Our primary safety net remains the quick, frequent autosave.

  • Function: Every 50 seconds, your application sends your work to our server.
  • Purpose: This aggressively updates the main copy of your essay on the server. If your browser crashes, the most you can lose is the work you completed in the last 50 seconds.
  • Where it’s Saved: This content is saved to the primary database record, which is what your instructor sees.
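
As a rough sketch of what a Tier 1 loop like this looks like on the browser side (the endpoint and field names below are illustrative assumptions, not our actual API):

```typescript
// Tier 1: push the current essay text to the server every 50 seconds.
const AUTOSAVE_INTERVAL_MS = 50_000;

function startAutosave(essayBox: HTMLTextAreaElement, essayId: string): void {
  setInterval(async () => {
    try {
      // Overwrites the primary database record -- the copy your instructor sees.
      await fetch("/api/essay/save", {          // hypothetical endpoint
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ essayId, text: essayBox.value }),
      });
    } catch {
      // A failed save is simply retried on the next 50-second tick.
    }
  }, AUTOSAVE_INTERVAL_MS);
}
```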

Tier 2: Historical Versioning (Every 5 Minutes)

This is the new feature that protects against data loss and provides a long-term audit trail.

  • Function: Every 5 minutes, a separate, dedicated process takes a full snapshot of your essay.
  • Purpose: This snapshot is saved as a complete, new record in a separate History Vault. If a crash happens, you now have a series of historical versions (from 5, 10, 15 minutes ago, etc.) that you can use to piece together your lost work.
  • Storage Efficiency: To avoid using up too much server space, the system is smart. It only keeps the last 5 versions for immediate recovery.
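
Conceptually, each snapshot-and-prune cycle works like the sketch below, where `historyVault` stands in for the separate History Vault storage (names are illustrative):

```typescript
// Tier 2: every 5 minutes, store a full snapshot and keep only the newest five.
interface Snapshot { essayId: string; text: string; savedAt: Date; }

const MAX_SNAPSHOTS = 5;

function takeSnapshot(historyVault: Snapshot[], essayId: string, text: string): void {
  // Each snapshot is a complete copy, not a diff, so any one of them
  // can be restored on its own.
  historyVault.push({ essayId, text, savedAt: new Date() });

  // Prune: keep only the last five versions to limit server storage.
  while (historyVault.length > MAX_SNAPSHOTS) {
    historyVault.shift(); // discard the oldest snapshot
  }
}
```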

2. Using the New Essay Recovery Tool

If your browser or computer crashes, you’ll find a new recovery button when you reload the writing task. This process allows you to retrieve a specific version and append it to your current working document.

Step-by-Step Recovery:

  1. Relaunch the Task: Close your browser and reopen the writing assignment. The system will load the last successful Tier 1 save (up to 50 seconds before the crash).
  2. Click the “Recover Version” Button: A new modal window will open, showing a list of available historical snapshots.
  3. Review the Snapshots: The list will show the time and date of each of your last five saved versions (e.g., “Saved at 10:15 AM,” “Saved at 10:10 AM”).
  4. Select and Append: Click the “Append to Essay” button next to the version you want.
    • The system retrieves that entire version.
    • It automatically appends the recovered text to the end of your current essay, clearly marked with a timestamp (e.g., --- Recovered version (10:10 AM) ---).
    • You can then quickly cut and paste the recovered sections back into the body of your essay.

Why Append?

We use the “append” feature to prevent accidentally overwriting good work. This method gives you full control, allowing you to manually review the recovered text and integrate it precisely where it was lost.
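
In sketch form, the append step itself is very small (the marker format matches the example above; the function name is illustrative):

```typescript
// Append a recovered snapshot to the end of the current essay, clearly marked
// with its save time, so nothing already written is overwritten.
function appendRecoveredVersion(
  essayBox: HTMLTextAreaElement,
  snapshot: { text: string; savedAt: Date },
): void {
  const timeLabel = snapshot.savedAt.toLocaleTimeString([], { hour: "numeric", minute: "2-digit" });
  essayBox.value += `\n\n--- Recovered version (${timeLabel}) ---\n${snapshot.text}`;
}
```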


3. Best Practices for Test Takers

While this new system offers powerful protection, you still have a role to play in preventing crashes. The vast majority of work loss incidents are caused by an overwhelmed computer.

To ensure a smooth, crash-free experience, please follow these rules before starting your assessment:

  1. Close All Unnecessary Tabs: Close all social media, streaming video services (YouTube, Netflix), and other assignment tabs.
  2. Minimize Background Applications: Close file downloads, large games, or other applications that consume significant memory.
  3. Ensure a Stable Connection: Although the versioning system protects against brief internet drops, a stable connection ensures your 50-second autosaves are always successful.

This system is now live and ready to keep your writing safe! Happy writing! ✍️

Chromebook Crash? Your Test Answers Are Safe.

We know that a browser crash—especially a memory-related issue on a device like a Chromebook—is the last thing any student wants to deal with during an important test or assignment.

We’ve designed our online assessment system with a robust, multi-layered data saving architecture to make sure that a client-side crash results in minimal, if any, lost work. The result? At worst, you might lose only the last few seconds of work on the single question you were answering.

Here’s a look at how our system ensures maximum resilience against unexpected interruptions.


1. Answers Save Instantly (Per-Question Saving)

We don’t rely on a single “Save” button or a large, infrequent auto-save. Instead, we use a technique called asynchronous saving that pushes your answers to our secure server the moment you make a change.

  • Multiple-Choice & True/False: As soon as you click an option, that selection is immediately sent to the server and recorded. Data loss for these question types is virtually impossible unless the crash occurs mid-click.
  • Short Answer & Essay: When you are typing, your response is saved frequently and on critical events (like clicking out of the answer box). This means the work you’ve completed on previous questions is secure, and only the latest few seconds of typing on the current question might be lost if the crash happens before the final save request completes.
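
A simplified sketch of this per-question asynchronous saving (element ids, endpoint, and payload names are illustrative, not our production code):

```typescript
// Send an answer to the server the moment it changes, without blocking the page.
async function saveAnswer(testId: string, questionId: string, answer: string): Promise<void> {
  await fetch("/api/test/answer", {              // hypothetical endpoint
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ testId, questionId, answer }),
  });
}

// Elements and identifiers for the current question (illustrative).
const choiceInput = document.querySelector<HTMLInputElement>("#choice-a")!;
const answerBox = document.querySelector<HTMLTextAreaElement>("#answer-box")!;
const testId = "test-123";
const questionId = "q-7";

// Multiple choice / true-false: save immediately on selection.
choiceInput.addEventListener("change", () => saveAnswer(testId, questionId, choiceInput.value));

// Short answer / essay: save on critical events, such as clicking out of the box.
answerBox.addEventListener("blur", () => saveAnswer(testId, questionId, answerBox.value));
```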

2. Time Used is Tracked and Recovered

For timed assessments, losing time due to a crash is just as frustrating as losing answers. Our system actively tracks and saves your elapsed test time to ensure a fair recovery.

  • Periodic Time Logging: A small, background function logs the total minutes you’ve used to the server at regular, short intervals.
  • Seamless Resumption: If your browser crashes, simply closing the crashed tab or restarting your device and logging back in will resume your test. The system immediately retrieves your last logged time, deducts it from your original limit, and starts the countdown from your remaining time. You will see a “Resuming” message before you continue.
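
In outline, the time logging and resumption math looks like this (the interval, endpoint, and field names are assumptions for illustration):

```typescript
// Log elapsed minutes to the server at regular, short intervals.
const LOG_INTERVAL_MS = 30_000;                  // illustrative interval
const testStart = Date.now();

setInterval(() => {
  const minutesUsed = (Date.now() - testStart) / 60_000;
  // sendBeacon keeps delivering even while a page is closing or crashing.
  navigator.sendBeacon("/api/test/time", JSON.stringify({ minutesUsed }));
}, LOG_INTERVAL_MS);

// On resumption: remaining time = original limit minus the last logged time.
function remainingMinutes(limitMinutes: number, lastLoggedMinutes: number): number {
  return Math.max(0, limitMinutes - lastLoggedMinutes);
}
```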

3. What Isn’t Saved (And Why It Doesn’t Matter)

While core data like answers and time are continuously saved, some non-critical, client-side items might be lost:

  • “Mark for Review” Status: Any questions you had flagged for later review will need to be re-marked upon logging back in.
  • Window State: Settings like “full-screen mode” will revert to default and will need to be re-initiated.

These minor losses have no bearing on your score or the integrity of the test. Our priority is making sure the data that counts—your answers and your remaining time—is secure and waiting for you when you log back in.

Sunset

Well, the advent of AI in 2022 was both a boon and a bane for Innovation Assessments LLC. As a boon, it was an opportunity to incorporate AI into our apps. As a bane, it meant the main source of our income, sales on marketplaces like TeachersPayTeachers and TeachSimple, was reduced considerably. I closed both accounts last week.

The effort to make Innovation a “going concern” has spanned 30 years under different names. FrenchRegents.com. MultiLilnguae.com. JonesHistory.net. TeachersWebHost.com. The idea was to develop a subscriber base for my applications, software built to meet real teachers’ needs, not what programmers imagined teaching to be like. I never had the capital to launch properly in a crowded marketplace.

I am retired from teaching now. My career ended officially in 2023. I continue to teach remotely a few hours a day, August to May, because it is a pleasure to teach. My blog posts will henceforth come from that direction.

If you want to subscribe to Innovation, you still can. But I am no longer engaging in marketing to promote it. Hey, I gave it a good go! Last spring I made a pitch to a virtual school I work for. Not having heard back from them, I surmise that my impressive Teams presentation was not enough to sell the service. It’s okay. They did me a kindness to hear my pitch.

I started coding classroom apps in 1993, when the closing Air Force base in Plattsburgh, NY, donated a dozen or so IBM 286s to the Crown Point Central School, where I was working. I learned QBasic and wrote simple programs for practicing verb conjugation for French (I was a French teacher then).

In the late 1990s, I was coding in C++ using a Borland compiler. I coded a messaging app, a drill and practice program, and even a security app for Windows 95! I was part-time technology coordinator for the school.

In the early 2000’s while active on a forum for foreign language teachers, I was contracted by a school on Long Island to code a web app for their classes in a “smart classroom”. From there I developed FrenchRegents.com and then SpanishRegents.com and even GermanRegents.com. I made some profit, until Barron’s Regents review made their test banks public, so I shuttered that business.

In 2007, I closed down and frankly forswore trying to make money coding. It wasn’t until 2011 or ’12 that I tried again. By then, I had switched to teaching social studies (I made the switch in 2004). JonesHistory.net was born.

The pandemic really supercharged my coding. I needed apps for teaching remotely. This was shortly after I moved from TeachersWebHost.com to InnovationAssessments.com, following consults with a small business specialist. The limited liability company was formed in 2023.

After I retired from Schroon Lake Central School in 2023, I started teaching online part-time for LanguageBird and for Proximity Learning. These continue to be a source of enjoyment for me and an inspiration for app development. Both let me use my own enormous library of teaching materials spanning 30+ years.

A few weeks ago, I did some research and concluded that, if Innovation was to have a future, it needed to be re-coded in a new architecture. I worked for a short while in Python and then in Laravel Blade (part of the PHP Laravel framework). Oy. The learning curve was steeper than I was able to tolerate. Add to this that I really didn’t have a reasonable expectation of any commercial success for my efforts, and I abandoned the effort. It was kind of hard, actually. I have been working to build this, one way or another, for something like 30 years. It’s not easy to switch gears, admit defeat, let it go. But as a matter of principle, I value letting go when necessary.

So, I continue to upgrade the Perl scripts to suit my needs for teaching remotely. It’s a pleasure, and it’s easier now with AI assistance. But even ChatGPT, in analyzing my old code, noticed the antiquity. Today it made a joke about my coding being “like Netscape 1999”. I laughed out loud, and I lamented a little that a robot could jest about outdated code.

Reader, rest assured that despite the “ancient” nature of the code, I have secured it against dangers, and if you are a subscriber you may be assured of security.

Reader, a lot of this is tied up in being 57 years old and seeing things fade away and change. At this point, I suppose I am grappling with that.

AI at Innovation: Three Ways Our Tools Support Teachers

Artificial intelligence isn’t here to replace teachers—it’s here to make their work more efficient, insightful, and impactful. At Innovation Assessments, we’ve built AI into our platform in three carefully designed ways. Each of these functions addresses a different part of the teaching cycle: preparing lessons, evaluating student work, and monitoring learning behaviors in real time.

Let’s take a closer look.


1. Teaching Assistant: Generating Prompts, Tasks, and Test Questions

Teachers often spend countless hours preparing materials: prompts for writing, comprehension tasks, practice questions, or even entire quizzes. Our AI-powered Teaching Assistant helps cut that prep time by generating high-quality starting points:

  • Assessment & activity prompts: Suggests open-ended discussion questions, role-play scenarios, or practice drills tailored to your subject.
  • Test question generation: Builds multiple-choice or short-answer items aligned to your chosen level and category, whether it’s social studies DBQs, French language tasks, or science practice sets.
  • Adaptability: Because the generator accepts teacher input on topic, difficulty, and format, you still set the pedagogical direction—the AI just does the heavy lifting.

The result? More time to focus on pedagogy and less on busywork.


2. Grading Assistant: Scoring Short Answers and Longer Essays

Grading is where AI can provide meaningful support without ever removing teacher authority. Our Grading Assistant uses OpenAI’s models to analyze student responses and offer suggested scores or rubric-based comments:

  • Short answer scoring: Provides a confidence-scaled score (e.g., full credit, partial credit) with a rationale tied to your rubric.
  • Essay analysis: Surfaces structure, clarity, and argument strengths/weaknesses so you can give students faster, more targeted feedback.
  • Teacher control: Every score is a suggestion—teachers make the final call. AI never replaces professional judgment.

This approach reduces turnaround time and makes it easier to give richer feedback, even on assignments with dozens of responses.
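
As a rough illustration of how a single rubric-based suggestion could be produced, here is a sketch of one scoring call (the prompt wording, model choice, and response handling are assumptions, not the platform’s exact implementation):

```typescript
// Ask the model for a suggested score and rationale; the teacher makes the final call.
async function suggestScore(question: string, rubric: string, answer: string): Promise<string> {
  const response = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-4o-mini",                      // illustrative model choice
      messages: [
        {
          role: "system",
          content: "You are a grading assistant. Suggest a score with a brief rationale tied to the rubric.",
        },
        {
          role: "user",
          content: `Question: ${question}\nRubric: ${rubric}\nStudent answer: ${answer}`,
        },
      ],
    }),
  });
  const data = await response.json();
  return data.choices[0].message.content;        // suggestion shown to the teacher
}
```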


3. Proctor Function: Analyzing Student Activity in Online Apps

Digital classrooms introduce new challenges: how do you know if students are fully engaged, struggling, or even drifting off task? Our Proctor Function gives teachers insight into behavior patterns during online interactions:

  • Session monitoring: Tracks student activity logs (e.g., navigation events, copy/paste, time away from page).
  • Pattern analysis: Uses AI to highlight irregularities—like frequent page exits during a quiz—or flag potential academic integrity concerns.
  • Formative insights: Goes beyond “cheating detection” by helping you spot disengagement, pacing issues, or moments when students may need extra support.

Think of it as a lens into classroom dynamics that’s hard to see in a virtual environment.
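
The events being tracked are ordinary browser signals. A minimal sketch of that kind of client-side capture (event choices and the logging endpoint are illustrative):

```typescript
// Record notable behavior events during an online task.
function logEvent(note: string): void {
  navigator.sendBeacon("/api/proctor/log", JSON.stringify({ note, at: new Date().toISOString() }));
}

// Student switches away from, or back to, the test tab.
document.addEventListener("visibilitychange", () => {
  logEvent(document.hidden ? "Left page" : "Returned to test");
});

// Student pastes text into the page.
document.addEventListener("paste", () => logEvent("Text pasted"));
```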


Why These Three?

We chose these categories—Teaching Assistant, Grading Assistant, Proctor—because together they cover the full arc of digital instruction:

  1. Before class (plan): generate engaging materials.
  2. After class (assess): provide consistent, fast feedback.
  3. During class (monitor): ensure students are active and supported.

Our guiding principle: AI should serve teachers, never the other way around.

Introducing Weighted Questions: The Smart Way to Grade Your Tests


Have you ever created a test where some questions were simply more important than others? Perhaps a single-sentence response question you intended as a quick check for understanding, alongside a more complex, multi-paragraph essay question that required deeper analysis.

Or have you decided after the test that some questions need to be removed? Or needed to correct the answer key and recalculate scores?

With our latest update, you can now assign specific weights to each question on your tests, allowing you to create more nuanced and accurate assessments.

What’s New?

You now have the power to define the value of every question. When you’re editing your test, you can set the weight for each question, for example:

  • Multiple-choice questions worth 1 point.
  • Short answer questions worth 5 points.
  • An essay question worth 20 points.

Our new scoring system will automatically calculate the final score for each student based on the weights you set.
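
The math behind that calculation is straightforward: the final percentage is the points earned divided by the total weight of all questions. A small sketch (field names are illustrative):

```typescript
// One graded question: the weight the teacher set and the fraction earned (0 to 1).
interface GradedQuestion { weight: number; fractionEarned: number; }

// Final percentage = points earned / points possible.
function weightedScore(questions: GradedQuestion[]): number {
  const possible = questions.reduce((sum, q) => sum + q.weight, 0);
  const earned = questions.reduce((sum, q) => sum + q.weight * q.fractionEarned, 0);
  return possible === 0 ? 0 : (earned / possible) * 100;
}

// Example: a correct 1-point multiple choice, a 5-point short answer at half credit,
// and a 20-point essay scored 15/20 -> (1 + 2.5 + 15) / 26 ≈ 71.2%.
weightedScore([
  { weight: 1, fractionEarned: 1 },
  { weight: 5, fractionEarned: 0.5 },
  { weight: 20, fractionEarned: 0.75 },
]);
```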

The Power of Re-Scoring

But what if you decide to change the weights after students have already taken the test? This is where the magic happens.

With the click of a single button, you can now re-score an entire class. Our new and improved algorithm will re-calculate every student’s grade, taking into account:

  • Updated Question Weights: If you change a question’s value, the scores will be instantly updated.
  • Answer Key Corrections: Did you find a mistake in the answer key? Correct it, and every student’s score will be recalculated.
  • Changes in Question Count: If you add or delete questions, the system will adjust the final score accordingly.

And it’s fast. This new re-scoring capability is built on a highly optimized system that can process hundreds of students and questions in a fraction of the time. We’ve even addressed edge cases, such as students who have had their old answers automatically archived, to ensure accurate and reliable results every time.

Our goal is to give you more flexibility and control over your assessments, so you can focus on teaching. Try out the new weighted questions feature today and see the difference.


NEW! AI Analysis of Proctor Notes (Student Engagement on Tests)

We’re excited to introduce a brand-new tool for teachers: AI Proctor Analysis. This feature takes the detailed proctoring logs collected during online assessments and automatically generates a professional, concise summary of student behavior—helping teachers spot issues faster and focus on teaching instead of sifting through logs.

How it Works

During an assessment, our system records digital behavior events such as page switches, text pasting, and other activity. These notes are stored securely in the teacher’s proctoring database.

With the new feature:

  1. Logs are gathered – For each student and test, the platform collects all behavior notes.
  2. Cleaned & organized – Duplicate or redundant entries are filtered so the report is readable.
  3. Analyzed by AI – The logs are sent through our secure AI integration. The AI is instructed to act as a strict test proctor, highlighting suspicious or irregular activity.
  4. Teacher summary – In just a few sentences, the AI generates a professional summary for the teacher, flagging potential problems and confirming if behavior was normal.
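
In miniature, steps 2 and 3 amount to cleaning the notes and building the analysis request that goes through the secure AI integration (function and field names below are illustrative):

```typescript
// Prepare a proctor-analysis request from a student's raw behavior notes.
interface ProctorRequest { system: string; user: string; }

function buildProctorRequest(notes: string[]): ProctorRequest {
  // Step 2: drop consecutive duplicate entries so the report stays readable.
  const cleaned = notes.filter((note, i) => note !== notes[i - 1]);

  // Step 3: the AI is instructed to act as a strict test proctor.
  return {
    system:
      "You are a strict test proctor. Summarize this activity log in a few professional sentences, flagging anything suspicious or irregular.",
    user: cleaned.join("\n"),
  };
}
```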

Why This Matters

  • Time-saving: No more scrolling through long behavior logs.
  • Professional tone: Reports are short, objective, and easy to share.
  • Enhanced oversight: Teachers get a clearer picture of digital test behavior at a glance.

Example

Instead of wading through dozens of raw log entries like:

Started task
Left page
Returned to test
Text pasted

The teacher sees a clear summary such as:

“The student briefly left the page twice and pasted text once. Behavior suggests potential use of outside resources. Recommend follow-up.”

Built with Security in Mind

  • Only authenticated teachers can access proctoring data.
  • Student activity logs are processed securely.
  • Every AI request is logged for accountability, including token usage and teacher identifiers.

The bottom line: With AI Proctor Analysis, you’ll spend less time interpreting logs and more time making informed decisions about your students’ online assessment behavior.


New Feature: Teacher Comments on Short Answer Questions

This week, while giving pretests for AP French, a teacher found they needed the ability to leave notes directly on individual student responses. Today I rolled out a new feature that makes that possible.

What’s New

When reviewing a student’s short answer question, teachers will now see a “+ Comment” button alongside the scoring tools. Clicking this button opens a simple dialog where you can type in your feedback.

  • Comments are saved directly to the system, so they’re always there the next time you revisit the student’s work.
  • Each comment is linked to a specific student and specific question, so there’s no confusion about what the feedback refers to.
  • You can edit or delete your comment at any time.
  • An optional toggle lets you make the comment visible to the student, so it can be private teacher notes or shared feedback.
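
In sketch form, each saved comment carries fields along these lines (names are illustrative, not the actual schema):

```typescript
// A teacher comment attached to one student's answer on one question.
interface ShortAnswerComment {
  studentId: string;
  questionId: string;
  teacherName: string;
  text: string;
  visibleToStudent: boolean;   // the optional toggle: private note vs. shared feedback
  createdAt: Date;
}

// Saving fires a single request when the dialog is submitted (hypothetical endpoint).
async function saveComment(comment: ShortAnswerComment): Promise<void> {
  await fetch("/api/comments/save", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(comment),
  });
}
```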

Why It Matters

This update is designed to give teachers more flexibility:

  • You can jot down quick reminders for yourself (“Check this student’s phrasing with the rubric later”).
  • You can leave direct feedback for students to help them improve.
  • You can build up a history of feedback that follows the student across sessions.

Keeping It Simple

I’ve worked to keep this feature as lightweight as possible:

  • Comments save instantly, no extra steps required.
  • The display is clean, with a simple box showing the comment, teacher name, and timestamp.
  • For students, if you’ve chosen to make a comment visible, they’ll see it in their results view — but private notes stay private.