Dear Rooster,
Well, I have a lot to say about your question, but still and all I don't really have an answer...one of the definitions of the word POLITICS is "the activities of governments concerning the political relations between countries," and in that sense war can be seen as a bad job of political relations, I guess...
However, peace still requires both countries to do a good job of political relations; I'm thinking of Neville Chamberlain and "peace for our time," used to justify British appeasement of Hitler...and we all know where THAT ended up...after years under the humiliation and deprivation of Versailles, Germany was bound and determined on revanchism.
Also, Carl von Clausewitz lived 1780-1831, so he died long before WWI. I have just been reading that WWI changed war forever; nothing before it compared, in that WWI could never be won in any traditional sense of the word. WWI was more a matter of endurance: the side that could keep sending millions of soldiers to their deaths, along with throwing endless matériel at the enemy, would prevail...you could not win, only outlast the other side in horrible trench warfare and chemical warfare...
* * *
So again, idk. Do you have an opinion on that?