EduTech from a Teacher-Coder: Engagement Without the Game

How to create meaningful, real-time engagement with a workflow that’s simple and actually usable in class

Whether you teach remotely like I do or work in person, you know that student engagement is a paramount concern. Students should not sit as passive recipients for long stretches of a lesson.

Gamification enthusiasts and coders who have never taught middle school often seem to believe that the answer is to make studying more like an Xbox adventure. Add music, competition, points, and tokens, and students will learn without even knowing it!

But I want my students’ cognitive load devoted to the lesson, not to the rules of the game, the points they earned, or the banter with the other team. To this end, I developed “live session” interactive versions of many of the Innovation apps.

The workflow goes like this: the instructor starts a host instance of the activity, copies a special participation link, and sends it to students, who then get an interactive screen of their own. Live sessions turn the activity into a shared experience that fosters engagement through inquiry, curiosity, discussion, debate, and reinforcement.

I use the TestApp and Étude live sessions to debrief after a test or to review before one. The teacher screen displays the questions one at a time. The host opens the session for responses and closes it when time is up. Student responses are displayed anonymously for debriefing.

“Engagement isn’t just activity—it’s thinking.” 

I use the Grammar app live session in my French classes. I display the prompt on the screen and open the session for responses; students submit their work, and I can display it anonymously for debriefing. It is exactly the same as the regular assignment, just delivered in an interactive form.

The Media presentation application is the one I use most often for teaching social studies and for my advanced French courses where I am delivering content. It is a powerful and flexible application that deserves a post of its own. Suffice it to say for now that the Media live sessions have all the tools we need to gather brief and extended student replies and reactions, from short answer to multiple choice and even a selection of emojis!

One of my students remarked that the live sessions were like a boring Kahoot! I laughed and replied that that was the intention: no points, music, sound effects, rankings, scores, or goofy animations. The focus is on the lesson. If anything is to be entertaining, it’s going to be me!

EduTech from a Teacher-Coder: Restoring the Teacher’s Line of Sight

For about a decade, classroom technology quietly broke something important.

Teachers lost their line of sight into student work.

I don’t mean theory—I mean the simple ability to know what students are actually doing.

Some call this “command and control,” but that misses the point.

What teachers actually need is simple: the ability to know what students are doing, in real time, so they can guide and support them.

We need the old-fashioned line-of-sight supervision and guidance that instructors maintained in effective classrooms before every student got a Chromebook loaded with office productivity software. I knew exactly what my students were doing as I circulated the classroom. I could look over a shoulder and offer advice on a forming essay. I could redirect students who had found something off-task more interesting. I could ensure with some reliability that no cheat sheets were being used on tests and that students were doing their own work. And I could keep the class workflow moving so we didn’t fall behind to delays and procrastination.

Then came Chromebooks whose screens we could not see or that were easily hidden. With them came office productivity tools designed for mature adults in paying jobs, people motivated, for the most part, to get their work done. Ironically, tools designed for productive adults often made classrooms less productive.

There are a number of expensive products on the market now for monitoring student screens. At my last district, we had one that let us monitor everyone’s tabs. But I really don’t think my workflow is much improved by surveilling a dozen tiny screencasts.

I’m retired now and teach remotely a few hours a day. I need more than ever to know exactly what my students are doing, both to maintain the pace of the lesson and to ensure assessment integrity. This post’s “EduTech from a Teacher-Coder” topic is the monitor and proctor features built into all of Innovation’s apps.

Monitor

Every application at Innovation comes with a monitor that displays in real time how students are progressing on their task. The test monitor shows which question each student is on and even has a messaging feature so I can quietly post notifications to students inside their test. The writing app monitor displays each student’s current essay, their word count, and their use of any AI licenses. The vocabulary quiz, sorting app, “KnowWhere” map study, cause-and-effect study, reading comprehension, cloze app, ordered list, forum, and even the AI chat application can all display student progress and often their work product. Every monitor can optionally hide student names, so teachers can show it on a shared screen or at the front of the classroom as a reminder to keep pace.

For many activities, monitors also let teachers see the correct responses. The monitor restores an essential element of classroom command and control: knowing exactly what students are doing.

Proctor

The proctoring feature runs throughout all of the activities. A proctor is an after-the-fact analysis, and proctors come with AI interpretation and summary features. When did the student start the app? How long did they spend on each question? Did they leave the screen? Paste in text? Right-click to use a spellchecker or an AI assistant that was not licensed?
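The questions above suggest the shape of the underlying data: a timestamped event log. Here is a minimal sketch of that idea in JavaScript. The class and event names are illustrative assumptions, not Innovation’s actual API.

```javascript
// Minimal sketch of a proctoring event log (illustrative names only).
// Every event is timestamped so an after-the-fact report can reconstruct
// when the student started, how long each question took, and whether the
// screen was left or text was pasted in.
class ProctorLog {
  constructor(now = Date.now) {
    this.now = now;      // injectable clock, which makes testing easy
    this.events = [];
  }
  record(type, detail = {}) {
    this.events.push({ type, detail, at: this.now() });
  }
  // Total milliseconds spent between entering and leaving a question.
  timeOnQuestion(q) {
    let total = 0, entered = null;
    for (const e of this.events) {
      if (e.type === "enter-question" && e.detail.q === q) entered = e.at;
      if (e.type === "leave-question" && e.detail.q === q && entered !== null) {
        total += e.at - entered;
        entered = null;
      }
    }
    return total;
  }
  count(type) {
    return this.events.filter((e) => e.type === type).length;
  }
}
```

In a browser, such a log would be fed by ordinary DOM events, for example `window.addEventListener("blur", () => log.record("left-screen"))` or a `paste` listener on the answer box, and the collected events would be sent to the server for AI summarization.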

The common view is that assessment in remote teaching is not reliable. But with a strong AI-assisted proctor running during the assessment and an adult supervising in the room, we can be assured of a result as reliable as in old-fashioned in-person classes.

Teacher Command and Control Supports Successful Student Outcomes

When the proper guardrails are in place, the guardrails we have always had in teaching, we can be more confident of delivering the kind of high-quality, effective instruction that leads to student success. A dozen applications at Innovation include monitoring, and all of them include proctoring.

For years, we handed students powerful tools…
and took away the teacher’s ability to see how they were being used.

That was the mistake and now we’re correcting it.

A Better Way to Assign Short Student Presentations Across the Curriculum

When I started teaching in 1991, the highest level of technology in my class was my pocket calculator. Supervision was a matter of circulating the room to make sure students were engaged.

When technology became part of our schoolrooms, we had to surrender much of the supervision we once had. Students could now hide behind Chromebooks, click away quickly when we walked by, and easily become off-task and disengaged. The main reason was that the first technology solutions were designed for offices, not classrooms. We thought this was a great idea, since many students would one day be using such applications in the workforce.

We were wrong about that.

Software designed for adults, for office workers and designers, is not appropriate for most classroom settings simply because it lacks the guardrails and monitoring we had in pre-EdTech days.

Yes, we worked around it. We added internet filters, screen monitoring software, and the like. But that is not the same as having direct observation of our students and control over their workflows.

Many efforts to create truly classroom-friendly EdTech have focused on “gamifying” learning. Developers believed the old trope that you can trick students into learning if they are having fun. Don’t get me started on that…

The problem I wanted to address in this post occurred in a remote AP French class I was teaching on Canvas. The assignment was to produce a two-minute, mostly improvised video presentation in French, modeling how the task is set up on the AP exam. The students dutifully uploaded their little videos to Canvas, and it was obvious that they were reading prepared scripts and had used an AI either to do the work or to correct it. I knew from class sessions that they were not capable of that level of language proficiency, and anyone watching could see they were reading.

How does one rationalize giving a high stakes grade for that?

EduTech Solution from a Teacher-Coder

Presto is an application on the Innovation platform that resolves the problem of students submitting AI-generated presentations and scripts without real learning or synthesis. While originally devised as an evaluation tool for world language learners, it is also extremely effective in content-area classes like social studies.

Students log in and are redirected to the assessment. After setting up their camera and mic and starting the camera, the task begins; only then can they see the prompts. A strict timer runs, and an AI-enhanced proctor records their engagement and activity on the page. Once started, students must finish, or they must be readmitted by the teacher. This prevents viewing the prompts and then restarting after research.

The proctor provides the supervision we often lack in modern education software. The time limit and the coordination of camera activation with prompt visibility prevent cheating very effectively.
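The coordination described above boils down to a small state machine: prompts are hidden until the camera is live, and a student who leaves mid-task is locked out until the teacher readmits them. Here is a sketch of that gating logic; the function and state names are illustrative, not the real Presto code.

```javascript
// Illustrative sketch of Presto-style gating: prompts stay hidden until
// the camera starts, and leaving a live session locks it until readmission.
function createSession() {
  let state = "setup"; // setup -> live -> finished, with a locked detour
  return {
    startCamera() { if (state === "setup") state = "live"; },
    promptsVisible() { return state === "live"; }, // prompts only while live
    leave()   { if (state === "live") state = "locked"; },   // student left
    readmit() { if (state === "locked") state = "live"; },   // teacher action
    finish()  { if (state === "live") state = "finished"; },
    state: () => state,
  };
}
```

The key design point is that the only transition out of `locked` is `readmit()`, a teacher-side action, so a student cannot preview the prompts, close the tab, and quietly start over.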

“AI has made scripted assignments meaningless. Presto measures thinking instead.”

More importantly, the structure encourages authentic thinking. Students must interpret the prompts and organize their ideas in real time rather than relying on pre-written scripts. Instead of reading polished AI-generated text, they must explain ideas in their own words within a clear time limit.

For teachers, this changes the evaluation process completely. Instead of trying to determine whether a script was written by the student or by an AI assistant, we can focus on what actually matters: a student’s ability to communicate understanding.

And that was the goal all along.

The Classroom Is Not a Game (and Not an Office Either)

Though retired, I still teach a few courses a day remotely. This week, I attended a professional development meeting for one of the companies for which I teach, where a presenter used a popular interactive presentation app. The presentation itself was excellent. The app, however, was another matter entirely.

I will grant that, as a developer of educational technology myself, I am a harsh critic. But I suspect even the hundred and fifty or so others on that Zoom call would agree. The app was heavily gamified, filled with sound effects and floating reaction emojis designed to promote “engagement.” Each emoji triggered a popping bubble sound as it drifted across the screen. Participants continued clicking them even after being asked to stop, while the presenter was attempting to explain how to construct a complex AI prompt. The result was not engagement, but distraction.

My earlier posts have noted my long-standing skepticism of gamification. Its promoters often cling to the old trope that if students are having fun, they will not even realize they are learning. Forgive me for sounding like the old fogey that I am, but that idea has always struck me as pedagogically misguided. I want students to know they are learning. More importantly, I want them to learn how to guide and regulate their own learning. Attention should be directed toward the material, not toward points, sounds, or game mechanics.

If you explore the Innovation platform, you will notice that it is intentionally plain. Interactive tools include emoji responses, but they are subtle, silent, and easily disabled. This is by design. The platform reflects how I actually teach, rather than how a game designer imagines learning should feel.

Because most teachers are not developers, we often adapt software that was never designed for classrooms in the first place. We rely on office productivity tools or on educational software built by developer teams whose instincts lean more toward gaming than pedagogy. I occupy an unusual position as both teacher and developer, and I find great satisfaction in coding applications that behave the way a teacher actually needs them to behave.

The Classroom is Not the Office

Having taught since 1991, I have lived through the entire technological transformation of education. My first classroom had chalkboards and binders. My last, before retiring three years ago, had 1:1 student laptops and a SmartBoard. One persistent problem has been that much of our classroom software originated outside education, particularly in office environments.

When we placed laptops running word processors and spreadsheets in front of students, we gained powerful tools but lost a degree of visibility and supervision. In 1991, it was nearly impossible for a student to hide off-task behavior behind a notebook. In 2026, it may be a hidden browser tab. What was marketed as “real-world experience” often came at the cost of instructional control.

At Innovation, I aim to design learning spaces that originate in education rather than being imported from the office or the gaming world. Our writing tools include optional AI proctoring and live monitoring so instructors can observe student work in progress. Our assessment tools provide similar oversight, along with messaging features that allow teachers to guide, redirect, or support students in real time.

In short, the goal is not to make learning noisier or more entertaining. It is to make it more focused, more observable, and more teachable.

Good educational technology should not compete with the lesson for attention. It should support the teacher, clarify the task, and fade quietly into the background of learning.

After more than three decades in the classroom, I have come to believe that the best tools are not the loudest or the most entertaining, but the ones that respect how learning actually happens: through focus, guidance, and sustained attention. If our software cannot preserve those conditions, then no amount of animation, gamification, or sound effects will make up for what is lost.

Innovation 2.0

The few who read this may have seen the post a while back called “Sunset,” in which I reflected on the difficulties and, well, failures, I suppose, of trying to develop an LMS as a small business without a huge bankroll for a coding team and marketing. In 2007, when I started this and made some money from my inventions, the internet was a very different place.

Then AI came along. There is plenty of material for blog posts on how it has transformed my teaching (I still teach remotely part-time). The big effort for me was devising ways to prevent, or at least make difficult, the inappropriate use of AI by my students. Interestingly, I turned to AI to do this.

Like my colleagues who did not just surrender to AI student work submissions, I first worked on changing how I designed my assignments. That only goes so far.

Next I rolled up my sleeves and started tweaking my own code in this platform, which I use for teaching remotely: timers, detailed logging of student activity in the browser, hiding content until time has passed, and eventually an API key from OpenAI so I could add a button that analyzes the logged data from student interactions and estimates the likelihood of inappropriate usage.

Once I started tweaking my old code, I noticed that the AI I was using to correct it, making enhancements beyond my own coding ability, was itself increasingly having trouble with old-fashioned, out-of-date coding practices in the Perl language. I asked it about this. It explained that my code base (admittedly 20+ years old) was so dated that it would not support even a moderate customer base. The database itself, holding my work and that of customers going back twenty years, had obsolete features beyond the scope of this post to explain. The work to re-code and update it all was enormous and overwhelming. That’s when the “Sunset” post was written.

But then I had a cool idea for an application. I needed a way to let my AP French students practice and be evaluated asynchronously on conversation skills. I wanted to write it in a modern, up-to-date code base, so I used AI to write it. I was not as proficient in PHP as in Perl, I was tired of coding, and I wanted to focus on curriculum development.

The result was smashing! And from there I kept building. Three months later, I have nearly completed Innovation 2.0. Wow. I have moved from coding myself to directing the AI to do the detailed coding. I am now the creative director, no longer consulting programming language manuals or searching Stack Overflow.

What’s especially exciting for me is that the new software works exactly as I wish it to. And it’s all in one place! That was why I started coding 30 years ago anyway! I like to build and invent.

So in January I will be using Innovation 2.0 with my own students to refine and debug it, then move customers over in February and start offering the platform publicly. There are great new apps to offer, a fully integrated AI support system with guardrails and controls, effective live monitoring, and more!

🎧 Recording for Upload (Using Chromebook or Chrome Browser)

If you can’t use the direct browser recording feature, you can record your conversation using your Chromebook’s built-in tools (like Voice Recorder or Memos) or a free online recording tool, and then upload the file.


Part 1: Record Your Audio File

  1. Open Your Recording Tool:
    • Chromebook App: Open the Voice Recorder app or Memos app on your Chromebook.
    • Online Tool: Open a new browser tab and navigate to a simple, free online voice recorder (e.g., searching “online voice recorder” will give several options).
  2. Start Recording: When you are ready, start the timer or recording app and give your responses to the prompts in the conversation task.
  3. Stop and Save: Stop the recording once complete. Look for a Save, Download, or Export button.
  4. Name the File: If the tool prompts you, give your file a clear name (e.g., MyConvoTask.mp3). This makes it much easier to find later!

Part 2: Locating the Saved File (The Hard Part!)

Once you click Save or Download, the file goes to your local storage, most commonly the Downloads folder.

1. Using the Chrome Downloads Bar

  • After the file saves, you should see a small bar at the bottom of your Chrome browser window showing the file name.
  • Click the small arrow (⯆) next to the file name.
  • Select “Show in folder” from the menu. This will open the file explorer right to the location of your file.

2. Using the Files App

If the downloads bar disappears, you can find the file using the Chromebook’s file manager:

  1. Click the Launcher (the circle icon in the bottom-left corner).
  2. Type “Files” and open the Files app.
  3. In the left sidebar, click on “My files” or “Downloads”.
  4. Look for the file name you gave it (e.g., MyConvoTask.mp3) or look for a file saved right around the time you finished recording.

3. Uploading to the Task

Once you’ve located the file in your Downloads folder, you can go back to the Upload Version of the Conversation Task and use the Choose File button to select and submit it.

Setting Mic Permissions in Chrome

🎙️ Troubleshooting: Fixing Microphone Permissions in Chrome (Chromebooks)

If you’re having trouble recording your conversation task, Chrome may have blocked your microphone. Follow these simple steps to fix your site permissions.


Step 1: Access Site Permissions

  1. Click the lock icon (🔒) located in the address bar, immediately to the left of the site URL.
  2. A small menu will appear with basic site permissions listed.

Step 2: Check and Change Microphone Setting

  1. Look for the “Microphone” listing in the small permissions menu.
  2. If it currently says “Block” ⛔, click on it and change the setting to “Allow” ✅.

Step 3: Use Detailed Site Settings (If Microphone Isn’t Listed)

If you do not see “Microphone” listed in the initial small menu:

  1. Click the “Site settings” link at the very bottom of that menu.
  2. This will open a new tab (chrome://settings/content/siteDetails?site=...).
  3. Scroll down to the Permissions section, find Microphone, and select Allow from the dropdown menu.

Step 4: Reload the Test Page

  1. Close the settings tab you just opened (if you used Step 3).
  2. Reload (refresh) your conversation test page.
  3. Chrome should usually prompt you again: “Allow innovationassessments.com to use your microphone?”
  4. Click Allow ✅ to start recording.

No More Lost Work: Introducing Automatic Versioning and Essay Recovery

We’re excited to announce a significant upgrade to our writing assessment platform: a robust, automatic versioning system. This change is designed to eliminate the anxiety and frustration caused by unexpected browser crashes, internet connection drops, or power outages.

This system ensures that if your computer fails, you can recover nearly every word you’ve written.


1. How the System Works: Two Tiers of Protection

We’ve implemented a two-tiered saving system that balances constant crash protection with efficient storage of long-term history.

Tier 1: Crash Recovery (Every 50 Seconds)

Our primary safety net remains the quick, frequent autosave.

  • Function: Every 50 seconds, your application sends your work to our server.
  • Purpose: This aggressively updates the main copy of your essay on the server. If your browser crashes, the most you can lose is the work you completed in the last 50 seconds.
  • Where it’s Saved: This content is saved to the primary database record, which is what your instructor sees.

Tier 2: Historical Versioning (Every 5 Minutes)

This is the new feature that protects against data loss and provides a long-term audit trail.

  • Function: Every 5 minutes, a separate, dedicated process takes a full snapshot of your essay.
  • Purpose: This snapshot is saved as a complete, new record in a separate History Vault. If a crash happens, you now have a series of historical versions (from 5, 10, 15 minutes ago, etc.) that you can use to piece together your lost work.
  • Storage Efficiency: To avoid using up too much server space, the system is smart. It only keeps the last 5 versions for immediate recovery.
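The retention rule in Tier 2 can be sketched in a few lines. This is an illustrative model only (the real vault is a set of server-side database records, and `addSnapshot` is a hypothetical name), but it shows the keep-the-last-five behavior described above.

```javascript
// Sketch of the Tier 2 "History Vault" retention rule: each snapshot is a
// complete copy of the essay, and only the most recent five are retained.
const MAX_VERSIONS = 5;

function addSnapshot(vault, essayText, savedAt) {
  vault.push({ text: essayText, savedAt });
  // Drop the oldest snapshots once we exceed the retention limit.
  while (vault.length > MAX_VERSIONS) vault.shift();
  return vault;
}
```

On a real server the same idea would likely be a periodic job that inserts a new row and deletes any rows older than the newest five for that essay.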

2. Using the New Essay Recovery Tool

If your browser or computer crashes, you’ll find a new recovery button when you reload the writing task. This process allows you to retrieve a specific version and append it to your current working document.

Step-by-Step Recovery:

  1. Relaunch the Task: Close your browser and reopen the writing assignment. The system will load the last successful Tier 1 save (up to 50 seconds before the crash).
  2. Click the “Recover Version” Button: A new modal window will open, showing a list of available historical snapshots.
  3. Review the Snapshots: The list will show the time and date of each of your last five saved versions (e.g., “Saved at 10:15 AM,” “Saved at 10:10 AM”).
  4. Select and Append: Click the “Append to Essay” button next to the version you want.
    • The system retrieves that entire version.
    • It automatically inserts the recovered text to the bottom of your current essay, clearly marked with a timestamp (e.g., --- Recovered version (10:10 AM) ---).
    • You can then quickly cut and paste the recovered sections back into the body of your essay.
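The append step above is deliberately simple: the recovered snapshot is attached to the end of the working document under a timestamped marker, so nothing is overwritten. A minimal sketch (the function name and snapshot shape are assumptions for illustration):

```javascript
// Sketch of the "append, don't overwrite" recovery step: the recovered
// snapshot is added below the current essay under a timestamped marker.
function appendRecovered(currentEssay, snapshot) {
  const marker = `--- Recovered version (${snapshot.label}) ---`;
  return currentEssay + "\n\n" + marker + "\n" + snapshot.text;
}
```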

Why Append?

We use the “append” feature to prevent accidentally overwriting good work. This method gives you full control, allowing you to manually review the recovered text and integrate it precisely where it was lost.


3. Best Practices for Test Takers

While this new system offers powerful protection, you still have a role to play in preventing crashes. The vast majority of work loss incidents are caused by an overwhelmed computer.

To ensure a smooth, crash-free experience, please follow these rules before starting your assessment:

  1. Close All Unnecessary Tabs: Close all social media, streaming video services (YouTube, Netflix), and other assignment tabs.
  2. Minimize Background Applications: Close file downloads, large games, or other applications that consume significant memory.
  3. Ensure a Stable Connection: Although the versioning system protects against brief internet drops, a stable connection ensures your 50-second autosaves are always successful.

This system is now live and ready to keep your writing safe! Happy writing! ✍️

Chromebook Crash? Your Test Answers Are Safe.

We know that a browser crash—especially a memory-related issue on a device like a Chromebook—is the last thing any student wants to deal with during an important test or assignment.

We’ve designed our online assessment system with a robust, multi-layered data saving architecture to make sure that a client-side crash results in minimal, if any, lost work. The result? At worst, you might lose only the last few seconds of work on the single question you were answering.

Here’s a look at how our system ensures maximum resilience against unexpected interruptions.


1. Answers Save Instantly (Per-Question Saving)

We don’t rely on a single “Save” button or a large, infrequent auto-save. Instead, we use a technique called asynchronous saving that pushes your answers to our secure server the moment you make a change.

  • Multiple-Choice & True/False: As soon as you click an option, that selection is immediately sent to the server and recorded. Data loss for these question types is virtually impossible unless the crash occurs mid-click.
  • Short Answer & Essay: When you are typing, your response is saved frequently and on critical events (like clicking out of the answer box). This means the work you’ve completed on previous questions is secure, and only the latest few seconds of typing on the current question might be lost if the crash happens before the final save request completes.
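The two behaviors above, instant saves for clicks and coalesced saves for typing, can be sketched as a small saver object. The names here (`makeSaver`, `sendToServer`) are illustrative stand-ins for whatever request the real platform makes, not its actual API.

```javascript
// Sketch of per-question asynchronous saving. Choice answers save
// instantly; typed answers are debounced and flushed on critical events
// such as clicking out of the answer box.
function makeSaver(sendToServer, debounceMs = 3000) {
  let timer = null;
  let pending = null;
  return {
    // Multiple-choice / true-false: push to the server on every click.
    saveChoice(questionId, answer) {
      sendToServer({ questionId, answer });
    },
    // Essay / short answer: coalesce keystrokes, send after a short pause.
    saveTyping(questionId, text) {
      pending = { questionId, answer: text };
      clearTimeout(timer);
      timer = setTimeout(() => { sendToServer(pending); pending = null; }, debounceMs);
    },
    // Critical events (blur, navigation): flush whatever is pending now.
    flush() {
      clearTimeout(timer);
      if (pending) { sendToServer(pending); pending = null; }
    },
  };
}
```

Wired to a real page, `saveChoice` would hang off the option’s click handler and `flush` off the answer box’s `blur` event, which is exactly why only the last few seconds of typing on the current question are ever at risk.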

2. Time Used is Tracked and Recovered

For timed assessments, losing time due to a crash is just as frustrating as losing answers. Our system actively tracks and saves your elapsed test time to ensure a fair recovery.

  • Periodic Time Logging: A small, background function logs the total minutes you’ve used to the server at regular, short intervals.
  • Seamless Resumption: If your browser crashes, simply closing the crashed tab or restarting your device and logging back in will resume your test. The system immediately retrieves your last logged time, deducts it from your original limit, and starts the countdown from your remaining time. You will see a “Resuming” message before you continue.
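The resumption arithmetic is straightforward: the server keeps the minutes already used, and on reload the countdown restarts from the remainder, clamped so it never goes negative. A sketch (function name assumed for illustration):

```javascript
// Sketch of timed-test resumption: subtract the last logged elapsed time
// from the original limit, never dropping below zero.
function remainingMinutes(limitMinutes, lastLoggedMinutesUsed) {
  return Math.max(0, limitMinutes - lastLoggedMinutesUsed);
}
```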

3. What Isn’t Saved (And Why It Doesn’t Matter)

While core data like answers and time are continuously saved, some non-critical, client-side items might be lost:

  • “Mark for Review” Status: Any questions you had flagged for later review will need to be re-marked upon logging back in.
  • Window State: Settings like “full-screen mode” will revert to default and will need to be re-initiated.

These minor losses have no bearing on your score or the integrity of the test. Our priority is making sure the data that counts—your answers and your remaining time—is secure and waiting for you when you log back in.

Sunset

Well, the advent of AI in 2022 was both a boon and a bane for Innovation Assessments LLC. As a boon, it was an opportunity to incorporate AI into our apps. As a bane, it meant the main source of our income, sales on marketplaces like TeachersPayTeachers and TeachSimple, was considerably reduced. I closed both accounts last week.

The effort to make Innovation a “going concern” has spanned 30 years under different names. FrenchRegents.com. MultiLilnguae.com. JonesHistory.net. TeachersWebHost.com. The idea was to develop a subscriber base to my applications, software built to meet real teachers’ needs, not what programmers imagined teaching to be like. I never had the capital to launch properly in a crowded marketplace.

I am retired from teaching now. My career ended officially in 2023. I continue to teach remotely a few hours a day August to May because it is a pleasure to teach. My blog posts hence will come from that direction now.

If you want to subscribe to Innovation, you still can. But I am no longer engaging in marketing to promote it. Hey, I gave it a good go! Last spring I made a pitch to a virtual school I work for. Not having heard back from them, I surmise that my impressive Teams presentation was not enough to sell the service. It’s okay. They did me a kindness to hear my pitch.

I started coding classroom apps in 1993 when the closing Air Force base in Plattsburgh, NY donated a dozen or so IBM 286’s to the Crown Point Central School, where I was working. I learned QBasic and wrote simple programs for practicing verb conjugation for French (I was a French teacher then).

In the late 1990s, I was coding in C++ using a Borland compiler. I coded a messaging app, a drill and practice program, and even a security app for Windows 95! I was part-time technology coordinator for the school.

In the early 2000’s while active on a forum for foreign language teachers, I was contracted by a school on Long Island to code a web app for their classes in a “smart classroom”. From there I developed FrenchRegents.com and then SpanishRegents.com and even GermanRegents.com. I made some profit, until Barron’s Regents review made their test banks public, so I shuttered that business.

In 2007, I closed down and frankly forswore trying to make money coding. It wasn’t until 2011 or ’12 that I tried again. By then, I had switched to teaching social studies (I made the switch in 2004). JonesHistory.net was born.

The pandemic really supercharged my coding. I needed apps for teaching remotely. This was shortly after I moved from TeachersWebHost.com to InnovationAssessments.com, following consults with a small-business specialist. The limited liability company was formed in 2023.

After I retired from Schroon Lake Central School in 2023, I started teaching online part-time for LanguageBird and for Proximity Learning. These continue to be a source of enjoyment for me and an inspiration for app development. Both let me use my own enormous library of teaching materials spanning 30+ years.

A few weeks ago, I did some research and concluded that, if Innovation was to have a future, it needed to be re-coded in a new architecture. I worked for a short while in Python and then in Laravel Blade (part of the PHP Laravel framework). Oy. The learning curve was steeper than I could tolerate. Add to that the fact that I had no reasonable expectation of commercial success, and I abandoned the effort. It was kind of hard, actually. I have been working to build this, one way or another, for something like 30 years. It’s not easy to switch gears, admit defeat, and let it go. But as a matter of principle, I value letting go when necessary.

So, I continue to upgrade the Perl scripts to suit my needs for teaching remotely. It’s a pleasure, and it’s easier now with AI assistance. But even ChatGPT, in analyzing my old code, noticed its antiquity. Today it made a joke about my coding being “like Netscape 1999.” I laughed out loud, and I lamented a little that a robot could jest about outdated code.

Reader, rest assured that despite the “ancient” nature of the code, I have secured it against dangers, and if you are a subscriber you may be assured of security.

Reader, a lot of this is tied up in being 57 years old and seeing things fade away and change. At this point, I suppose I am grappling with that.