I found an interesting YouTube channel for history (Dr Alan Brown), and one of his documentary videos says YES - the Weimar Republic might have been able to survive had the Great Depression not come along!
His argument is that many Germans believed they had not really lost World War I at all, but had been stabbed in the back by internal forces: socialists, liberals, Jews, the usual suspects...and then the Great Depression destabilized the situation in Germany too much, the "straw that broke the camel's back," and up came Hitler and Nazism.
It is interesting, I did not know this, but the Allied forces did not enter or occupy Germany after WWI (except for one area, the Rhineland, I think?). So the German people never 'felt like' they lost that war...they felt betrayed, somehow. And then the Depression came on top of all that, and Weimar went down.
Have you ever encountered this idea? Might it be true?