Please disperse the repetition dates (a little)

I am using Clozemaster very steadily (streak = 108) and relatively intensely at the moment, targeting ca. 200 repetitions a day, and my score is nearing 500,000.

I am struck by how poorly distributed/smoothed the daily workload remains over time.

Looking at my schedule for the coming days, I have days with 16 reps and days with 276, a crazy difference of roughly 17x.

As a user of Supermemo, I never face that issue, because Supermemo slightly randomizes the repetition spread. An interval would not be exactly 90 days or 180 days, but 90 plus or minus a random amount of up to 5% of 90, 180 +/- 5% of 180, and so on.

This obviously has no impact on memorization efficacy.

After a few iterations, the result is that you do not get the crazy daily workload clusters that Clozemaster currently produces: everything gets randomly smoothed out after a while.

I believe this is programmatically trivial to implement and would greatly improve the comfort of using the app for committed users.
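To illustrate how little code this could take, here is a minimal sketch of such a fuzz, assuming a flat +/- 5% window (the function name and the 5% figure are just assumptions for the example, not Supermemo's or Clozemaster's actual code):

```python
import random

def fuzz_interval(interval_days: int, fuzz_ratio: float = 0.05) -> int:
    """Shift an interval by a random amount within +/- fuzz_ratio of itself.

    E.g. a 90-day interval becomes anything from roughly 86 to 94 days,
    so cards that would otherwise land on the same date spread out.
    """
    delta = interval_days * fuzz_ratio
    return max(1, round(interval_days + random.uniform(-delta, delta)))

# Usage sketch: next_review = last_review_date + timedelta(days=fuzz_interval(90))
```

Because the offset is proportional to the interval, long intervals get spread over more days than short ones, which is exactly what dissolves the clusters over time.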

Many thanks for your consideration.

5 Likes

Here is a graphical illustration of my point, using the March repetition schedules of my drills in Clozemaster and Supermemo:

Clozemaster has an easy win here to close a qualitative gap with Supermemo and improve user experience.

4 Likes

I’ve been thinking along the same lines, that some randomization would smooth things out in the long run.

Below is the review forecast for one of my language pairings:

[screenshot: review forecast chart]

I wonder about this peak I will be seeing shortly; my guess is that it’s my punishment for having been extra diligent on one day a year back. :wink:

5 Likes

What are your review settings under both “Review Intervals” and “For sentences answered correctly that are already 100% Mastered”?

If your review intervals are multiples of each other (like 10 and 30), or even if they just share some divisor (like 20 and 30), I believe there will be a tendency for reviews to start clustering on certain days: two cards that coincide once will keep coinciding every least common multiple of their intervals. It so happens that the default review intervals are multiples of each other (perhaps 0, 1, 10, 30, and 180 days), so this is common.
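To put a rough number on that, here is a toy calculation (just the arithmetic behind the clustering claim, not Clozemaster's actual scheduler): two cards that happen to be reviewed on the same day, each on a fixed cycle, next coincide after the least common multiple of their intervals.

```python
from math import lcm  # Python 3.9+

# Two cards reviewed together once, then each on a fixed cycle,
# pile up on the same day again every lcm(a, b) days.
for a, b in [(10, 30), (20, 30), (10, 33), (11, 23)]:
    print(f"intervals {a} and {b} days -> same-day pile-up every {lcm(a, b)} days")
```

Intervals with shared factors (10 and 30, 20 and 30) re-collide every 30 or 60 days, while near-coprime ones (10 and 33, 11 and 23) push that out to 330 or 253 days, which is why default multiples tend to produce peaks.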

I believe the intervals chosen by “Hard/Normal/Easy” for sentences that are already 100% Mastered are always either half of, or an integer multiple of, the 100% Mastered interval. I don’t think the interface currently allows us to change this, however.

For one of my languages, I long ago changed the settings to 1, 5, 11, 23, and 365 days, with an always-increasing next review interval for the Hard/Normal/Easy buttons. I do get pile-ups, perhaps partly because 5 goes into 365 evenly, but more likely because the Hard/Normal/Easy intervals are multiples of 365. Today I changed the 5 to a 4 so that none of the intervals have any factors in common, but I still expect to see pile-ups.

Randomization would probably do a better job of smoothing out peaks, but at least setting the review intervals is under our control.

@mike, do you have any comment as to the feasibility of the kind of dispersion that @Anxos is mentioning? It would be a little harder to explain in the interface, but hopefully not too much harder.

3 Likes

Here is how it is implemented in the Obsidian SRS plugin, directly inspired by Anki (and Supermemo):

The user documentation, where it describes the intervals, just has to note “+/- 5% to avoid repetition clustering”.

Easy win.

4 Likes

Agreed! Work in progress, aiming to get this added within the next few months if not sooner.

5 Likes

I’m still waiting eagerly for this feature.

2 Likes

In @mike we trust.

With the patience of angels.

3 Likes

Fuzz is coming, but that will only help with reviews scheduled from here on out. There will still be two issues: 1. you may already have a huge number of reviews backlogged, and 2. a bunch may already be queued up for the same day, like in the graph @morbrorper posted above.

Issue 1 might be out of scope for this thread. A max reviews per collection per day setting seems like it might be the best option, but we’re open to ideas/discussion. We’re also not sure yet how that would work with review forecasting.

For issue 2, perhaps some way to redistribute reviews? A form like “Redistribute reviews evenly over the next N days” or “Redistribute reviews with a max of N per day” that you could submit whenever you get backed up or are coming back after a break? Open to ideas here too; we’re still thinking it through.
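Just to make the idea concrete, a rough sketch of the “evenly over the next N days” version (the names and the plain list-of-dates representation are placeholders for discussion, not the actual data model):

```python
from datetime import date, timedelta

def redistribute_evenly(due_dates, n_days, start=None):
    """Spread a pile of due dates evenly over the next n_days days.

    due_dates: dates of the backlogged/clustered reviews.
    Returns new dates, with the earliest-due reviews landing on the earliest days.
    """
    start = start or date.today()
    ordered = sorted(due_dates)
    total = len(ordered)
    # i * n_days // total walks from 0 up to n_days - 1 in equal-sized chunks.
    return [start + timedelta(days=i * n_days // total) for i in range(total)]

# e.g. 500 backlogged reviews over 14 days -> about 36 per day
```

The “max of N per day” variant would be similar: walk the sorted backlog and push anything over the daily cap to the next day.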

Will let you know once fuzz is up.