This page addresses the question of duplicate content, but it does not describe what I'm facing. Since first encountering the Google "canonical algorithm bug" two months ago, I've made only a little progress: I was able to get Google to stop selecting old pages as canonical for important new pages. Unfortunately, the issue now is that Google can't tell the new pages apart. Here are two examples (replace example with signalogic):
Inspecting codec_samples in GSC shows directcore as "Google-selected canonical":
Of course this blocks the codec_samples page, which used to bring in 50+ clicks a day. Now I cannot get the page indexed at all (I have verified that with a site: search).
According to the webconfs "Similar Page Checker", these pages are only 5% similar, and they obviously look different. Neither links to the other. What else is Google paying attention to? What can I do to get Google to see the codec_samples page as unique?
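For what it's worth, one thing I've been considering (not confirmed to help) is adding an explicit self-referential canonical tag to each page, so that each URL at least claims itself as its own preferred version. A minimal sketch, assuming page filenames like the ones above (the exact paths here are placeholders):

```html
<!-- In the <head> of the codec_samples page: a self-referential canonical,
     signaling that this URL is its own preferred version -->
<link rel="canonical" href="https://example.com/codec_samples.html" />

<!-- Likewise in the <head> of the directcore page, so each of the two
     pages explicitly claims itself rather than leaving the choice open -->
<link rel="canonical" href="https://example.com/directcore.html" />
```

I realize Google treats rel=canonical as a hint rather than a directive, so this may not override whatever its algorithm is keying on, but it would at least remove any ambiguity on my end.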
A note about the GSC URL Parameters settings: I have them set to "Every URL", with the subcategory set to "Specifies".