
Beyond Quigley's Philosophy of Science to Universal Intelligence


jabowery

Recommended Posts

I decided to take a look at Carroll Quigley's "The Evolution of Civilizations" not because it is considered a prerequisite for discourse about civilization in these fora, but because, for some time, my own focus has been on the abysmal state of the social sciences qua sciences.  The social sciences are so abysmally unscientific that it is a revolutionary act of genius for anyone to bring to bear anything remotely resembling the scientific method.  Moreover, if one attempts to bring the social sciences into consilience with the larger body of human knowledge, one is attacked with religious fervor, as evidenced by the treatment of E. O. Wilson by his Harvard colleagues in the 1970s over the nascent field of sociobiology.  So Quigley was a revolutionary genius -- not so much because he offered anything fundamentally new, but simply because he spoke of a few obvious truths about science in a field of virtually universal deceit: the social sciences of the 20th century.

 

Quigley's approach, however admirable given the horrid context, can too easily lead one to accept premises of his which have subsequently shown themselves to be both scientifically inadequate and ethically vacuous.

 

First and foremost, the ethical vacuity on display in "The Evolution of Civilizations" is shared by the entire field of the social sciences.  It may reasonably be summed up by comparison with the ethics of medicine.  In medicine, even if one has conducted double-blind controlled studies of the safety and efficacy of a treatment (i.e., one has established strong evidence that the treatment causes beneficial effects) -- even then it is considered unethical to apply the treatment to human subjects without their informed consent.  Accepting Quigley's proclamation that controlled experiments cannot be conducted in the social sciences to establish causality, the first duty of the ethical social scientist should be to denounce the use of his findings in any way that would violate the informed consent of human subjects in social engineering.  Let me re-emphasize in stronger terms: Quigley is not alone in this absence of ethics among social scientists; his posture is universally de rigueur.

 

Nevertheless, those who hold Quigley up as an exemplar, however justified, have an ethical obligation to point out this ethical vacuity.

Secondly, Quigley himself describes the social science equivalent of statistical mechanics -- averaging over large numbers to make predictions.  At the same time, he goes to great lengths in his discourse about "human nature" to emphasize that "culture" determines, for practical purposes, the outcomes for statistically significant numbers of individuals.  This is, essentially, the Boasian dogma of 20th-century anthropology.  It is upon this basis that we have seen the diagnosis of "institutional racism" held up as the "explanation" for statistical outcome differences between racial groups.  This, in turn, has justified the expenditure of many trillions of dollars on social engineering projects spanning over half a century, with outcomes that are, at best, questionable and that, in any event, violate the scientific ethic of informed consent in the treatment of human subjects, as described above.

 

Having now made my essential critique of Quigley's otherwise reasonable premises, I want to point out what he got _very_ right in his presentation of scientific method, and how, with modern advances in universal intelligence based on mathematically defining Ockham's Razor in pursuit of automated science, we may be in a position to push beyond Quigley's limits.
 

Ray Solomonoff essentially proved Ockham's Razor essential to science in terms of computation theory, and did so at the dawn of the computer age. However, more than half a century into the computer age, we still haven't begun exploring those implications in a practical way. Here's an obvious implication that should have been pursued almost from the outset in the 1960s:

Whenever you have a dataset and are trying to come up with a predictive model, you have two basic options that avoid overfitting:

 

  1. Use the data you have, not to create the model but to test it.
  2. Approximate the data's Kolmogorov Complexity program as best you can so as to approximate Solomonoff Induction.

 

#1 invariably ends up being impractical, since you can't _really_ construct a model from first principles. In any event, as you start to "consume" your data in tests of your models, you end up refining your models, which gets you into the land of post hoc theorization and thence overfitting as you consume more data. The best you can do is what Enlightenment philosophers came up with: experimental controls -- which is to say, experimental setups that are all identical except for being treated in slightly different ways (including one receiving no treatment, called "the control").
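The "consuming your data" problem above can be made concrete with a small simulation: if you repeatedly score candidate models against the same data and keep whichever one does best, the apparent accuracy inflates even when no model has any real predictive power. A minimal sketch, using pure-noise labels and random "models" (all names are illustrative, not from any real study):

```python
import random

random.seed(0)
n = 200
# Labels are pure coin flips: no model can truly beat 50% accuracy.
labels = [random.choice([0, 1]) for _ in range(n)]

# Each "refined model" here is just another random predictor, but we
# keep re-scoring candidates on the SAME data and keeping the best --
# exactly the post hoc loop described above.
best_acc = 0.0
for _ in range(500):
    preds = [random.choice([0, 1]) for _ in range(n)]
    acc = sum(p == y for p, y in zip(preds, labels)) / n
    best_acc = max(best_acc, acc)

# Apparent skill well above 50%, earned entirely by data reuse.
print(f"best apparent accuracy: {best_acc:.2f}")
```

The selected "model" will typically report roughly 60% accuracy on data it can, by construction, predict no better than a coin flip.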

 

The social sciences have become the modern equivalent of a theocracy, given their impact on public policy -- but social scientists haven't reached the level of scientific ethics required to insist that their theories not be taken as justification for imposing experiments on massive human populations, as the Federal arrogation of social policy from the Laboratory of the States requires. If social scientists had anything worthy of being called "ethics," they would insist on devolution of social policy to the States and on Federal support for the migration of people to the States whose social policies they find mutually agreeable. This directly addresses the scientific need for experimental variation as well as the ethical need for informed consent when dealing with human experimental subjects.

In the absence of such humility, the social sciences did have one other option:

 

Data compression to approximate Kolmogorov Complexity.

 

Note that I am not talking here about a general algorithm for data compression. I'm talking about a much simpler and more obvious idea:

 

Comparing theories by how well those theories -- losslessly -- compress the same datasets.

 

And this is where I come to my perception of a "religious aversion" to Solomonoff Induction:

 

Whenever I see arguments against the utility of Solomonoff Induction in the aforementioned role -- comparing theories by the size of executable archives of the same datasets -- they are _invariably_ (in my experience) strawman polemics. Yes, Kolmogorov Complexity is incomputable -- but that's not the argument. We're not trying to come up with a program that compresses the datasets! There is a difference between a program that compresses the datasets and a program that DEcompresses the datasets (the latter being the approximation of the KC program). This difference is so obvious that its conflation in these arguments -- its _predictable_ conflation -- is reminiscent of Orwell's notion of "Crimestop": selective stupidity to avoid violating Ingsoc, the official ideology of The Party. There are other, less obviously stupid, strawmen that arise from time to time, but these are almost invariably philosophical attacks on Cartesianism or on the scientific method itself. While it is fine to have those philosophical arguments, it seems rather silly to hold up practical application of Solomonoff Induction on that basis, as virtually the entire structure of technological civilization is Cartesian.
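The compressor/decompressor asymmetry can be made concrete: what gets scored is the executable archive -- a self-contained decompressor plus payload whose output is exactly the dataset -- and only its size matters; the effort spent finding it is never counted. A toy sketch, using zlib as a crude stand-in for a real theory and an invented miniature dataset (nothing here is claimed to approximate Kolmogorov Complexity well):

```python
import base64
import contextlib
import io
import zlib

# Toy stand-in for a survey dump (highly regular, so very compressible).
dataset = b"id,answer\n" + b"42,agree\n" * 400

# Build an "executable archive": a program whose OUTPUT is exactly the
# dataset. Its size is what we score; the search effort behind it is not.
payload = base64.b64encode(zlib.compress(dataset, 9)).decode()
archive = (
    "import base64, zlib\n"
    f"print(zlib.decompress(base64.b64decode('{payload}')).decode(), end='')\n"
)

# Run the archive and verify byte-exact reproduction of the dataset.
buf = io.StringIO()
with contextlib.redirect_stdout(buf):
    exec(archive)
assert buf.getvalue().encode() == dataset

print(len(dataset), len(archive.encode()))  # the archive is far smaller
```

A better theory of the data would yield a smaller archive; the ranking requires only measuring file sizes and checking byte-exact output, both trivially computable.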

Link to comment
Share on other sites

I honestly don't know the point you're making by the end of your post.

 

There is a lot of background knowledge required to get all the nuances of the issue but that is not necessary to get the general point:

 

Ockham's Razor, rightfully pointed to by Quigley as important in forming social theory, has for over half a century been formalized in the theory of automated science (Solomonoff induction), with specific application to data-rich areas that are not amenable to controlled experimentation.  This knowledge has had zero impact on social science, a data-rich field hobbled by the lack of controlled experimentation.

 

For instance, if the National Science Foundation set up a prize for the smallest executable archive of the General Social Survey database, it would ignite a firestorm of private sector statistical analysis to form the most comprehensive social model -- and no money would be spent by the government unless a superior social model were developed.
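A sketch of how such a prize might be judged (the `judge` function and the toy dataset are hypothetical illustrations, not any actual NSF mechanism): a submission qualifies only if it reproduces the published dataset byte-for-byte, and qualifying entries are ranked purely by archive size.

```python
import hashlib

def judge(entries, dataset):
    """Rank prize entries: each entry is (name, archive_bytes, output_bytes).
    Only byte-exact reproductions qualify; smaller archives rank first."""
    target = hashlib.sha256(dataset).digest()
    qualifying = [(len(archive), name)
                  for name, archive, output in entries
                  if hashlib.sha256(output).digest() == target]
    return [name for size, name in sorted(qualifying)]

# Toy dataset standing in for something like a GSS dump.
dataset = b"age,income\n34,52000\n41,61000\n" * 100

entries = [
    ("naive",  b"x" * 3000, dataset),        # byte-exact, but a large archive
    ("clever", b"x" * 120,  dataset),        # byte-exact and small: the winner
    ("lossy",  b"x" * 50,   dataset[:-1]),   # smallest, but not byte-exact
]
print(judge(entries, dataset))  # ['clever', 'naive']
```

The judging rule needs no theory of the data at all -- a hash check and a byte count -- which is what makes the prize mechanism cheap to administer.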

Link to comment
Share on other sites

There is a lot of background knowledge required to get all the nuances of the issue but that is not necessary to get the general point:

 

So is this post a nuance of the issue or the general point?

 

 

Ockham's Razor, rightfully pointed to by Quigley as important in forming social theory, has for over half a century been formalized in the theory of automated science (Solomonoff induction), with specific application to data-rich areas that are not amenable to controlled experimentation.  This knowledge has had zero impact on social science, a data-rich field hobbled by the lack of controlled experimentation.

 

For instance, if the National Science Foundation set up a prize for the smallest executable archive of the General Social Survey database, it would ignite a firestorm of private sector statistical analysis to form the most comprehensive social model -- and no money would be spent by the government unless a superior social model were developed.

 

Sounds good. Still not seeing your point, though?

Link to comment
Share on other sites

So is this post a nuance of the issue or the general point?

 

 

 

Sounds good. Still not seeing your point, though?

The general point is that there is a practical path forward -- beyond, but consistent with, Quigley -- in social science that has been available since the early 1960s, based on advanced computation theory. Despite the enormous efforts put into computational "statistical analysis" over that period (for example, SPSS), this path has been not only neglected but, in more recent years, actively resisted.

Link to comment
Share on other sites

"Nevertheless, those who hold Quigley up as an exemplar, however justified, have an ethical obligation to point out this ethical vacuity."
That's an interesting notion.  Not convincing though.

" At the same time, he goes to great lengths in his discourse about "human nature" to emphasize that "culture" determines, for practical purposes, the outcome for statistically significant numbers of individuals"
Perhaps he was wrong in attributing the failure of efforts in Africa to culture rather than to IQ.  And I think you might be grouping what he identified specifically as the goldbrick nature of Africans under "culture," instead of just viewing it as, perhaps, an observable outcome of low IQ -- which Quigley might not have known about, but which he was nonetheless not wrong to associate as a factor, even if he missed the underlying cause.  And as for the Russians, the Japanese, or the Chinese, he was describing his theory of why the societal system, not just the culture, reacted as it did to technology.  So I am failing to see where you think he went "to great lengths ... to emphasize that 'culture' determines, for practical purposes, the outcome for statistically significant numbers of individuals," or even how this is necessarily wrong in all cases (or, from a reading of Quigley, how it should be taken as necessarily right for all cases).

"Comparing theories by how well those theories -- losslessly -- compress the same datasets."
I think this is a far step from anyone's mind except yours.

To my knowledge, many of the datasets involved in 'social science' are so small relative to the number of significant factors that would have to enter a model that such a model would not be predictive.  I suppose there are exceptions -- for instance, traffic, where a model might crudely but correctly predict flow (excepting, perhaps, unusual events), despite the factors influencing that traffic including aspects of the personal lives of the people driving.

Consequently, the concept of just making models which fully adhere to the existing data is something scientists, especially climatologists, seem to fail at.  Exploring the deficits of using compression size to compare models seems futile at this point.

Link to comment
Share on other sites

The general point is that there is a practical path forward -- beyond, but consistent with, Quigley -- in social science that has been available since the early 1960s, based on advanced computation theory. Despite the enormous efforts put into computational "statistical analysis" over that period (for example, SPSS), this path has been not only neglected but, in more recent years, actively resisted.

 

I totally believe it. 

 

It blows my mind that Quigley was able to put his work together, especially his theory on the evolution of civilizations, pre-internet and largely without precedent. It's not like all the historians before him had the same scientific outlook towards history that he did.

 

It also blows my mind that nothing of his theory has been developed, or even fucking acknowledged in social "science" circles. That's why I tout it around as much as I can--I know it's not good enough, but it's the best and only starting point we have so far.

 

You haven't mentioned the biggest thing that I took from the book yet: Quigley's ideas of an instrument and an institution, how the former inevitably becomes the latter, and how this is a principle of historical forces. Particularly, that the essence of any civilization is its Instrument of Expansion. That is to say: how does the civilization create excess wealth, save it, then reinvest it into the civilization (a.k.a. capitalism)?

 

His entire description of civilizations revolves around how the instrument of expansion is or isn't functioning. The reason I say this is the most important principle, is that if we're going to resuscitate the West, or convert any other Civ to freedom, it's going to have to be done by accurately analyzing the Civ's instrument of expansion and correcting, fixing, or changing it. 

Link to comment
Share on other sites

It also blows my mind that nothing of his theory has been developed, or even fucking acknowledged in social "science" circles. That's why I tout it around as much as I can--I know it's not good enough, but it's the best and only starting point we have so far.

 

 

That's because most issues are multifactorial, not monofactorial. Statistical analysis is pretty advanced but not many social scientists can apply it properly. 

Link to comment
Share on other sites

...You haven't mentioned the biggest thing that I took from the book, yet: Quigley's ideas of an instrument, and institution, and how the former inevitably becomes the latter, and how this is a principle of historical forces. Particularly, that the essence of any civilization is its Instrument of Expansion. That is to say, how does the civilization create excess wealth, save it, then reinvest it into the civilization (a.k.a. capitalism)?

 

His entire description of civilizations revolves around how the instrument of expansion is or isn't functioning. The reason I say this is the most important principle, is that if we're going to resuscitate the West, or convert any other Civ to freedom, it's going to have to be done by accurately analyzing the Civ's instrument of expansion and correcting, fixing, or changing it. 

Well, I could go into a detailed review of Quigley's insights in TEoC, but I agree with your positive assessment of his emphasis on instrumental organs "evolving" institutional sclerosis.  It is central to the cyclical rise and fall of civilizations and is really the point I was trying to make about rent seeking in another topic in these fora.  In other words, it is the no-brainer profits from instrumental organs that select for the no-brainer rentiers who take over those organs, causing institutional sclerosis.  Quigley's mode of explanation of this, the most critical aspect of his theoretic framework, is more kinematic than dynamic.  You really need dynamics if you want to fix things.

 

Link to comment
Share on other sites

Well, I could go into a detailed review of Quigley's insights in TEoC, but I agree with your positive assessment of his emphasis on instrumental organs "evolving" institutional sclerosis.  It is central to the cyclical rise and fall of civilizations and is really the point I was trying to make about rent seeking in another topic in these fora.  In other words, it is the no-brainer profits from instrumental organs that select for the no-brainer rentiers who take over those organs, causing institutional sclerosis.  Quigley's mode of explanation of this, the most critical aspect of his theoretic framework, is more kinematic than dynamic.  You really need dynamics if you want to fix things.

 

 

Talk more about rents. I'm seeking rents myself. Are you speaking specifically of landlordism, or just any venture that involves getting a check in the mail from somebody else using your space?

 

Because if the legal system in a society is built correctly, I would say a landlord has just as much of an obligation to provide safe housing and be liable for what happens to the property and how he treats the tenant. Landlord is in charge of insurance, taxes, repairs, and properly selecting a good tenant, and the tenant is responsible for paying his rent for these services.

Link to comment
Share on other sites

There is a lot of noise about "rent seeking" but the best way to think about it is in terms of privatizing positive externalities generated by others.  This is the mirror image of crony capitalists socializing their costs (as in the 2008 Wall Street bailouts).  In classical economics, these privatized positive externalities are called "economic rents" because they were originally associated with the portion of land value arising from such external investments as a military to defend the realm.  Such increases in land value are based on the rent the land owner can charge tenant farmers, for example.  

My contention is that Quigley's "institutionalization" arises as "authorities" (whether capitalist, religious, military or political) figure out ways of shifting the effective tax burden off of their privatized positive externalities and onto things like economic activity, or their socialized negative externalities.

Link to comment
Share on other sites

What exactly is the purpose behind Social Science? "Social Justice", "Playing Populous"?

 

Where does Trump factor into Social Science models? Instead of tracking statistics, is there a way to track individuals ahead of the curve? Or at least correlate individual politicians or tycoons, etc., with long-term national or company performance, say over a 15-year period.

 

How does rent seeking manifest itself in the USA? I mean, you guys have millions upon millions of acres of land.  Yes, there are the stock market bailouts, but isn't that more outright theft or "interventionism" (central planning)? Not like the UK, or perhaps ancient or modern Athens, with a restricted amount of land and natural resources. Or Japan, although they have a declining population.

 

Thinking about how a computer might need so much processing power that, even if various data (what do social scientists use?) could reliably be entered into a computer model, it would be impractical to process. If a chess game is complex, how sophisticated can a social science model be and still be of use to "someone"? Perhaps abstract models could be useful as a guide.

 

Perhaps if smaller areas were used, societal models could be better interpreted, to make better and, crucially, more equitable decisions. Maybe there are publicly available models linked with merchant retailers like eBay or Amazon somewhere? On a political level, how is an idea such as "spontaneous order" implemented at the local level? Thinking of Europe: areas are pretty much screwed if/probably when the last semblances of order break down -- Muslim prison populations, etc. Could a handful of people really make a difference, if prepared and positioned opportunely?

Link to comment
Share on other sites


What exactly is the purpose behind Social Science? "Social Justice", "Playing Populous"?


The de facto role of so-called "social science" is that of theology in a theocracy:

 

Provide pseudo-scientific justification for the political agenda of those with whom the "social scientists" identify.

 

Think about your example:

 

 


Thinking of Europe: areas are pretty much screwed if/probably when the last semblances of order break down -- Muslim prison populations, etc. Could a handful of people really make a difference, if prepared and positioned opportunely?


There are multiple powerful interests that want open borders.  They fund social scientists via public institutions.  Social scientists return the favor by providing "scientific papers" justifying open borders with high-sounding arguments that amount to things like:

"If you increase diversity, things get better -- at least in the long run."  

This is basically what Putnam says in "E Pluribus Unum" (his diversity study, often conflated with "Bowling Alone"), which was one of the more honest attempts to measure the actual effects of "diversity" on societies -- and he found that things actually got worse for most people in the things that matter most.  So he delayed publishing his findings until he could come up with a chapter about why it was all OK -- which is basically "at least in the long run."

If all the major databases that social scientists use to publish their papers were appended into one big file, and prizes were awarded for ever smaller executable archives of it, it would totally nuke the social pseudo-scientists cum quasi-theologians.

This would be vastly more powerful in the good that it would do than 1000 Stefan videos.

Link to comment
Share on other sites

An academic note for those interested in formal justification of Ockham's Razor:

 

Quantitative justification for preferring simple theories is usually attributed to Ray Solomonoff's early-1960s work on algorithmic probability based on Kolmogorov complexity.  However, it appears the first such justification appeared in Harold Jeffreys's early-1930s book "Scientific Inference".  The main difference between Jeffreys's work and Solomonoff's later work was the formal system used to encode theories.  Solomonoff used Turing machines -- making his work directly applicable to computation.  Jeffreys used Principia Mathematica's theory of types.  Interestingly, type theory has become a modern foundation for much of computer science, largely taking the place of Turing's "tape memory" machine.
 
Marcus Hutter's 2007 paper "On Universal Prediction and Bayesian Confirmation" gives a wide survey of this history and favors Solomonoff's treatment on technical grounds that I don't quite comprehend, but I tend to trust his judgment on these matters.  Be that as it may, the key insight enabling Solomonoff's approach seems to be that rigorous formalization of a theory (to account for _all_ observational data) yields a string of symbols, and the length of that string can be used to numerically compare competing theories' probabilities of being correct.
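The length-to-probability link can be stated numerically: under the Solomonoff-style universal prior, a theory encoded in L bits receives weight proportional to 2^-L, so two theories that account for the data equally well have prior odds set entirely by their length difference. A minimal illustration (the function name is my own invention):

```python
def prior_odds(len_a_bits: int, len_b_bits: int) -> float:
    """Prior odds in favor of theory A over theory B when each theory's
    weight is 2**(-length in bits) -- the universal-prior weighting."""
    return 2.0 ** (len_b_bits - len_a_bits)

# A theory 10 bits shorter is favored 1024:1 before any data arrive;
# equally long encodings start at even odds.
print(prior_odds(490, 500))  # 1024.0
print(prior_odds(500, 500))  # 1.0
```

This is why even modest differences in compressed size are decisive: every bit saved doubles the odds in the shorter theory's favor.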
Link to comment
Share on other sites

  • 1 month later...

I think I know what you mean. After I read Evolution of Civilisations I thought I was completely interested in anthropology, so I read a book on anthropology that wasn't interesting in the least. So then I realised that what I must really be interested in is sociology. So I bought two second-hand sociology textbooks from a university -- I think they were published by Oxford University Press, but I've since thrown them out. Absolutely worthless.

In the first book they bang on and on about Marx's theories, and in between Marx's theories they had statistics and trends of wage gaps, different wages between the sexes...

The second book painted a picture of what it's like to live in every city and every town in Australia. Neither book had any theories about how civilisations rise and fall; they both obsessed over how some people earn more than other people.

So I'm sitting there having read Evolution of Civilisations, and then reading the textbooks that university students are studying, and it just blows my mind how pointless those books are.

Link to comment
Share on other sites
