glaurung: (Default)
Someone brought "They Don’t Read Very Well: A Study of the Reading Comprehension Skills of English Majors at Two Midwestern Universities" by Susan Carlson, Ananda Jayawardhana, and Diane Miniel (CEA Critic, Volume 86, Number 1, March 2024) to my attention today. A PDF of the paper is here.

This is an appallingly badly designed study and a seriously flawed paper, and the authors should be fucking ashamed of themselves. (Carlson is an English professor, Miniel was one when the research was conducted in the 2010s, and Jayawardhana is a statistician, all at Pittsburg State University, which, despite the name, is in Kansas, so the two universities studied were presumably close to home.)

They interviewed 85 students majoring in English or English education (mostly juniors and seniors with some sophomores and four freshmen) at two Kansas universities. Each participant took a reading test designed to determine 10th grade literacy, filled out a survey, and then was asked to spend 20 minutes reading the opening seven paragraphs of Dickens' Bleak House aloud, pausing to explain the meaning of each sentence as they went along. They were given dictionaries and allowed to look things up in those or on their phones. They were told that it didn't matter if they did not finish going through the passage before time was up. (I'll put the opening paragraphs they used in the first comment below so you don't have to google it on Project Gutenberg)

The authors recorded the students reading aloud and explaining the passage to an interviewer, transcribed those recordings, and then tagged and analyzed the transcripts. Their conclusions were that only five percent of the students were able to properly understand the Dickens passage ("proficient readers"), another 38 percent understood about half the passage ("competent readers"), and 58 percent struggled to understand the passage ("problematic readers"). "Problematic readers often described their reading process as skimming and/or relying on SparkNotes" (page 6).

They conclude that most of the people majoring in English do not have the reading skills necessary for such a major and do not gain those reading skills from their first and second year classes.

This paper has two huge problems and one large one.

1. I am probably in or near their top 5% of readers, and I would have struggled and been incredibly frustrated if someone asked me to read the beginning of a Dickens novel one sentence at a time, explaining each sentence as I go along. That's not how novels are intended to work, especially densely written 19th century novels. Reading a whole paragraph, or the whole passage, then going back and working through it bit by bit, sure. But expecting to extract meaning from each sentence in isolation without knowing what's coming in the next sentence -- no fucking way. Fictional prose is intended to *flow*; you can't ask someone to chop it up into bits based on the punctuation and expect those bits to make sense without the context of what comes next.

2. They assigned 21st century American students a passage from a mid 19th century British novel, thereby turning it from a test of whether or not they could read and understand a chunk of complex literary prose, into a test of that *plus* whether or not they had adequate working knowledge of an archaic and foreign prose style, culture, vocabulary, and setting. OF COURSE the students struggled and did poorly in figuring out the meaning of "Michaelmas term," "the Lord Chancellor," and "Lincoln’s Inn Hall" in just the first sentence. Not because they're poor readers, but because they're not 19th century Londoners.

If the researchers had been serious about trying to gauge the students' reading ability without confounding their results with the students' poor familiarity with 170 year old prose style, setting, and culture, they could have assigned them a dense passage from a 20th or 21st century American literary novel. But they didn't. This is the worst kind of bigoted, classist, prior-knowledge-based intelligence testing. Shame on the authors, and shame on the journal for publishing this crap.

The not so huge but still large problem:
3. Forty-one percent of the study's participants were "English education" majors, not traditional English majors. Even if both majors are taught by the English department, they are very different beasts with very different course requirements that attract very different types of student. One is a major for people who wish to become primary and secondary school teachers of English (aka grammar, literacy, writing, and maybe also age-appropriate novels), the other is for people who enjoy reading and analyzing literature. The authors do not say anything about how the two majors differed in their ability to understand the opening of Bleak House, an omission that makes me raise my eyebrows very high.

In sum: Those who can't, teach. Those who can't do research, write poorly conceived papers on how their students are bad at reading.

***

That said, I am unsurprised that significant numbers of students struggled with the passage. Even aside from the design flaws which artificially lowered the scores of the students in the study, some people who aren't actually interested in reading and thinking about novels get a degree in English as a job credential (like those education majors who were included in the study, who may only be in the English department because it's a path to getting a teaching certificate that doesn't involve math).

And it is a sad fact that secondary schools turn out tons of students each year who have never really learned to read well, some of them with high GPAs. Now that a 4 year degree is required for many jobs that used to be open to high school graduates, they end up taking classes they're not really equipped for.

Which reminds me of something my sister said about her year teaching English at a magnet school in Austin, Texas. She was one of the only teachers who stocked her classroom with age appropriate books and encouraged her students to pick out and read ones that appealed to them. The administration and most of the rest of the teachers regarded this sort of thing as a bad idea, because students reading books for fun were not spending time honing the skills needed for doing well on the next standardized test.

To the extent that it's a real problem, and not one manufactured to produce a shocking academic article, it's multigenerational at this point. Those who don't read books themselves have become schoolteachers, and are teaching children to read, but not to read very well. Those children grow up, go to college, and frustrate their professors with their lack of reading skills.
glaurung: (Default)
Every so often, I see another media story about how a new kind of nuclear reactor that avoids all the horrible problems of the ones we have now is being studied, or a startup is looking to build one, or whatever. Don't be afraid of nuclear power, these articles proclaim, there is new, better technology that will resolve all the problems with the bad old reactors. The nuclear technologies I've seen mentioned like this make up a Venn diagram of fast neutron reactors, molten salt reactors, and thorium reactors.

And these kinds of stories come from a mix of hucksterism and religion. Just as we have a religion of space enthusiasm (where solar power stations or helium 3 or whatever bullshit technology becomes the rallying cry for the religion's real goal, of having a city on the moon/Mars/in orbit, and having people permanently *Living in Space*), and the religion helps sustain huckster snake oil plutocrats who don't really care about space at all except as a way of siphoning off government subsidies and pumping up stock prices for their space technology companies -- so too we have a religion of Nuclear Power (it's the Future!), and nuclear power technology companies whose plutocrats have snake oil to sell. (PS: my data-free impression is that space enthusiasts vastly outnumber space hucksters, but for nuclear power, the ratio is much more even or perhaps reversed).

Some nuclear power advocates have at least half a leg to stand on (solar and wind are not 24/7/365 power sources, and nuclear power plants *are* a carbon-free way to provide round the clock power regardless of weather). But solar has become SO much cheaper than fossil fuel, let alone nuclear, that it leaves budgetary headroom for adding some kind of power storage to a solar farm and still being less expensive than the alternatives - and solar powered storage (like a lake that you pump full during the day and drain through hydroelectric generators at night) completely avoids all the regulatory and PR hassles of nuclear power.

Other nuclear advocates seem to Want to Believe in nuclear because they are right wing and regard solar as tainted by the leftist eco green conspiracy, or something? IDK.

But after encountering another "the new generation of (insert technical descriptor) nuclear power plants will completely avoid all the problems you've come to expect from nuclear power" article, I decided to try and figure out just how much truth there is to such articles. After picking away at the question for a while, I'm finally typing everything up so my time will not have been completely wasted. The rest of this post comes from reading far, far too many web pages (mostly on wikipedia but also elsewhere) devoted to nuclear power. Read more... )

Having read far too much and gone down far too many rabbit holes, I think I can say confidently that 90% of the claims in articles touting the bright future of new! improved! no longer dangerous or scary! nuclear power are hogwash. Things those articles tout nearly always involve making proliferation-enabling technologies routine (no one other than members of the Church of Nukes wants this), assume that technologies still on the drawing board will work out as advertised (they never do), and/or gloss over many, many hard to solve problems. Meanwhile, right now, we already have the ability to just use renewables paired with energy storage. We soon won't need nuclear power anymore, yet somehow there are still scads of acolytes of the Nuclear Church who refuse to accept that their God has become irrelevant.
glaurung: (Default)
I wrote this a while ago on Facebook and utterly forgot to crosspost it here.

American history classes usually peg the start of the telegraph to 1844, with Morse sending "What hath God wrought" from Washington to Baltimore. Actually, a bunch of people in Europe and America had been mucking around with sending electrical signals over wires for three decades by that point: Morse's system was merely the first to have great success and go on to be used all over the world.

But the first long distance telegraph networks were not even electrical. Starting in the early 1790's, first France, then Sweden and England built extensive *optical* telegraph networks. Unlike the later electrical telegraphs, which were largely commercial operations from the get go, the early networks were entirely government funded and non-official communications were forbidden. By the early 1800's, countries all over Europe were using optical telegraph systems for sending long distance messages - this being the Napoleonic wars, the networks were mainly used for military dispatches. With Napoleon's defeat, some networks fell into disuse, but others switched to carrying shipping news and other government traffic.

Each network consisted of a long series of towers, preferably on hills, with a direct line of sight from one to the next. The first tower would deploy its signals, and someone at the next tower, looking through a telescope, would see them and duplicate them on his own tower for the next tower in line to copy. Each country jingoistically implemented its own bespoke signalling system, even when that meant adopting a slower, harder to use method instead of some other nation's superior system. Nearly all of these networks were for government use only, and paid for out of military budgets.

The idea of an optical signalling system was old - Robert Hooke read a paper to the Royal Society suggesting a system in 1684, but no government saw the need for such a thing until the 1790's. After the revolution, France found itself at war with all its neighbours, and rapid communication became essential. Claude Chappe's semaphore system was just what the revolutionary government needed. It used two vanes on a movable arm. Varying the angles of the vanes and the arm itself allowed a lexicon of 194 signs. Later that was pared down to 94 semaphoric signs, which were used in pairs to create a code book of 8,000-ish words. During daylight, on days without fog or rain, messages could be sent from Paris to Lyon (500 kilometers away) in 9 minutes. The first line of towers was built in 1792, and within 4 decades France had a vast network of telegraph towers (there's a map on Wikipedia). Napoleon loved the system, and built signal lines in conquered territories as well as commissioning portable towers which he brought with him on campaign. Chappe called his invention the semaphore, but the name that stuck was télégraphe.
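As an aside, the size of that codebook is just pair combinatorics: n signs sent two at a time give n² possible entries. A quick sketch, using the sign count quoted above (historical sources disagree on the exact figures):

```python
# Codebook capacity when each codeword is an ordered pair of signs
# drawn, with repetition, from a fixed sign vocabulary.
def pair_codebook_size(n_signs: int) -> int:
    return n_signs ** 2

print(pair_codebook_size(94))  # 8836 -- the "8,000-ish" words above
```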

In Dumas's novel The Count of Monte Cristo, Dantes bribes an optical telegraph operator to transmit false information as part of his scheme to bankrupt his enemy Danglars. In real life, the first ever case of wire fraud occurred in France in 1834 when two bankers bribed telegraph operators to insert some stock market data into their official transmissions, disguised as coding mistakes. The telegraph was also used to prevent fraud, sending out news of the results of the French lottery, closing the gap between the drawing in Paris and the news reaching the provinces, during which fraudsters had long suckered people into buying worthless tickets.

Inspired by France's system, Sweden's Abraham Edelcrantz came up with a binary encoding system that was harder to see, but faster and simpler to operate than Chappe's semaphore signals. Each telegraph tower held a grid of 10 large black shutters which could be open (horizontal/invisible) or closed (vertical/visible). One offset shutter indicated an "A" prefix, and 3 groups of 3 indicated a 3 digit octal code, allowing a code book of 1024 words. By putting lights inside boxes with the shutters at either end, it was possible to transmit messages at night. A mechanical control panel enabled the operator to program the next code group in advance, then one press of a pedal moved all the shutters at once to display the next code group. Sweden's network was the largest after France's, but I haven't been able to find a map. :(
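To make the shutter scheme concrete, here's a minimal sketch of reading one shutter pattern off as a code group: one prefix shutter plus three groups of three shutters, each group read as an octal digit, for 2¹⁰ = 1024 patterns in all. The mapping of shutters to bit positions here is my assumption for illustration, not Edelcrantz's documented convention:

```python
# Decode a 10-shutter Edelcrantz-style pattern: shutter 0 is the "A"
# prefix, and the remaining nine shutters form three 3-bit octal digits.
def decode_shutters(shutters: list[int]) -> str:
    """shutters: 10 bits, [prefix, g1a, g1b, g1c, g2a, ..., g3c]."""
    assert len(shutters) == 10 and all(b in (0, 1) for b in shutters)
    prefix = "A" if shutters[0] else ""
    digits = []
    for g in range(3):
        bits = shutters[1 + 3 * g : 4 + 3 * g]
        digits.append(str(bits[0] * 4 + bits[1] * 2 + bits[2]))
    return prefix + "".join(digits)

# Prefix raised, groups reading 6, 3, 6:
print(decode_shutters([1, 1, 1, 0, 0, 1, 1, 1, 1, 0]))  # A636
```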

Sweden's network covered 200km in 1809. Most of it went into disuse with the end of the Napoleonic wars, but it was put back into service in the 1830's and continued to be expanded until 1854, when electrical telegraph systems began to take over. The country's fractured coastline and many islands made converting the optical network to an electrical one quite challenging, and Sweden's telecommunications system was a hybrid of optical and electrical from the 1850's through the 1870's. The last pair of optical stations were not shut down until 1881.

In England, George Murray devised a six shutter binary system. The first line was built in 1795, and the network encompassed sixty-five towers by 1808. As in Sweden, the system went into disuse after Napoleon's defeat, but in 1822 the Royal Navy rebuilt one line (from the Admiralty in London to Portsmouth), and after studying the matter, switched from binary shutters to a more easily seen system devised by Home Riggs Popham that, somewhat like the French system, used two semaphore signalling arms attached to a vertical pole. The line remained in service until it was replaced with an electrical telegraph in 1847.

Also in the 1820's, one of the few commercially owned and operated optical telegraph systems started up in the UK, with a line running from Holyhead in Wales to Liverpool, to enable news of incoming ships to reach the Liverpool stock market hours before the ships arrived in port.

Spain's first optical telegraph line went into service in 1800, and by the 1840's the country had three lines connecting Madrid to border and port cities - Barcelona, Cadiz, and Irun (the last is a town on the north coast at the border with France). Spain's very hilly landscape made running electrical lines difficult, and while the first electrical telegraph line began operating there in 1855, the last optical lines continued to be used until 1876.

Prussia came to the game very late, with a single Berlin-Koblenz line opening in 1833 and being replaced by an electrical system in 1849. Two other commercial lines carrying shipping news opened in Germany in 1837 and 1847: both shut down in the early 1850's. Prussia used a binary system like Sweden's, but with 12 shutters.

All of these systems paved the way for later electrical telegraph networks, as the operators of the time had to invent and implement data transmission protocols that are now taken for granted - things like error control, rate control, message priority, and so on.

The incredible speed with which the later electrical telegraph networks were built can at least partly be credited to the earlier optical networks making everyone aware of just how useful fast communication could be. Once a much faster, all-weather, and much more private and secure system came along, governments and industry were all on board despite the high up front costs.

Two last bits: Today's flag semaphore signals are adaptations of those used by the British Admiralty in their 1830's optical telegraph. And the many hills in England, Europe, and even parts of America which are today named the local equivalent of "telegraph hill" were all places where an optical telegraph tower once stood.
glaurung: (Default)
I was avoiding starting an intimidating 4 book series, so I rummaged around for one-offs I hadn't read before, and found myself reading "Stealing the Elf King's Roses," by Diane Duane, published 2002.

The cover screams fantasy. The spine *says* fantasy. For nearly the entire book, it instead felt like a multiuniversal near-future science fiction novel. The only fantasy element is that in this collection of five (plus one newly discovered) alternate universes, Justice is not an abstract concept but rather an aspect of the will of the universe. Each universe has its own "ethical constant," and, in the alternate Earth where we start out, sentencing in the criminal justice system (which has lawyers, judges, and juries, all quite familiar) is carried out not by humans but by Justice herself. If She decides the appropriate penalty for fraud is to spend a period of time as a weasel, then the fraudster gets transformed into a weasel.

The protagonists, Lee (human) and Gelert (madrin, aka a sapient horse-sized dog) are lanthomancers. They have True Sight, which enables them to See (Smell for Gelert) the psychic residue of recent past events. They work as legal lanthomancers, investigating crimes and then prosecuting the accused in court. They aren't part of the DA's office, but rather they and other lanthomancers are subcontractors who get assigned cases by the DA's office in random rotation. Having them be prosecutors as well as investigators is one of the few rough spots in the novel - the novel treats the DA's office as analogous to those in the world we know, ie, a large organization with numerous employees, but all those people seem to have very little to do since all the actual investigations and trial work is shown being done by the lanthomancers.

And aside from that rough spot, this is a well plotted SF-like novel with a fully thought out sheaf of alternate universes. Five of the universes know about and trade with each other using world gates, and interact with each other via an interuniversal UN-like body. Some are almost but not quite like the real world, others are quite different. Alfen is the world of elves - immortals who are also uncannily beautiful. And the elves have a monopoly on fairy gold - the element from their universe has unique properties that make it essential to building affordable world gates. Once the Alfen sell fairy gold to the other universes, it becomes a hot commodity with futures trading and so on, as the number of inter and intra universe gate projects that would like to use it always exceeds the available supply. On the version of Earth that is closest to the real world, it's 2007, but the technology is much more advanced than that, with hovercars, massive Blade Runner-esque supersized public buildings, advanced office productivity applications (actually advanced, not "advanced for 2002"), and so on.

What starts out as a police procedural mystery (an Alfen expat working in Los Angeles is murdered) quickly turns into a tale of political and economic intrigue.

Like all Diane Duane novels, it was extremely good and I'm glad I read it. But don't be fooled by the cover: this is a thick layer of SF wrapped around a kernel of fantasy. If you enjoy Duane's mix of sf and fantasy in her young wizard series, you'll enjoy this, although the vibe is very different (those feel like fantasy told in an SF register, this feels like SF in a fantasy register? I think? Something like that). If you don't like your SF peanut butter to ever mix with your fantasy chocolate, then you knew you didn't want to read this the minute I started to describe it.
glaurung: (Default)
Decades ago, I came across volumes 2 and 3 of the Sorcery Hall Trilogy in a bookstore. They were by Suzy McKee Charnas, author of the amazing Holdfast Chronicles, so I bought them instantly. And then they sat on my shelf unread for ages as I never got around to buying volume 1.

Earlier this year, Charnas died, and I decided it was time to fill the holes in my collection of her books. I finally got volume 1, and I also got something called "The Kingdom of Kevin Malone." And today, after many delays, I have finally finished reading the last of them. the good stuff )

And then picked up "The Kingdom of Kevin Malone," published in the early 90's, and... I'm not at all sure what to think. It's a portal fantasy - young Amy is roller skating in Central Park when a boy pins a brooch to her sleeve. She recognizes it as one stolen from her by a neighbourhood bully years ago, and recognizes the boy as the bully himself. She chases him down the path and through one of the pedestrian underpasses that dot the park. On the other side of the tunnel, though, is a fantasy world, with a quest and a prophecy in which she plays an important role. a book I have complicated feelings about )

It's a profoundly dissatisfying novel, and I'm not completely sure if Charnas intended it that way, or if the power of the tropes she was using prevented her from writing a better, more subversive ending.
glaurung: (Default)
The unmitigated pedantry blog recently posted about the "long peace" - the much reduced frequency of warfare, first in Europe, then worldwide, that started in 1815 with the end of Napoleon and has lasted (with a few interruptions) until now. The pedantry blog spoke approvingly of Azar Gat's work in providing an explanation of the Long Peace, and disapprovingly of Steven Pinker's bestseller "The Better Angels of Our Nature." I knew of the Pinker book but had never read it, and had never heard of Gat. I was intrigued by the explanation proffered in the blog, and decided to read Gat and Pinker. Which I have now done, and... Gat's argument about the long peace is interesting, but it's embedded in a steaming pile of authoritarian, fash-apologist garbage. Pinker's book makes almost exactly the same overarching fash-apologist argument, but with extra helpings of racism. What follows is my response to both authors' discussions of human nature and war in prehistory. Their arguments about the "long peace" will have to wait for another post. This got long. Read more... )
glaurung: (Default)
I recently finished reading "The Gunpowder Age" by Tonio Andrade. It's a book about the history of gunpowder as used in warfare in China (with bits about its use in Europe, provided mainly for context and comparison).

It's also a book about the racist, colonialist myth that guns are A European Thing, that China may have invented gunpowder but never used it in warfare, or that China may have invented guns but they never actually went very far with them and their gun tech was always hopelessly outmatched by superior Western guns. Andrade doesn't call these myths out as the racist/colonialist garbage that they are... but he does demolish them quite thoroughly, by going to both Chinese and European primary sources (and making it clear that secondary sources have long been and continue to be woefully poor when it comes to the history of gun technology. Even some Chinese historians have bought into the myth). In other words, by way of looking at gun tech, Andrade is attacking the orientalist idea of China being "stagnant," "decadent," or otherwise somehow innately inferior to Europe/backwards compared to Europe.

Andrade's counter-narrative is that Chinese stagnation happened only during periods of (relative) peace and unity. The land populated by Chinese-speaking people has sometimes been completely unified and at peace, but at other times (especially during dynastic successions, but also under dynasties that failed to achieve hegemony) it splintered into smaller but still quite powerful states, each more-or-less seriously interested in conquering the others. He argues that the existential threat of conquest drove R&D for new weapons, whereas periods of unity and peace led to military complacency and a significant slowdown of military research. Europe, in contrast, never had *any* comparable periods of peace, so once gun-having Europeans arrived in East Asia during the era of colonial expansion, China would from time to time lag behind Europe in gun technology, only to catch up once a new period of heightened warfare began.

The colonialist meme that China was decadent and weak solidified in the 19th century, during one of those long periods of peace and stability within China. When the threat from Europe began to prod China to resume military investment in the 19th century, they found themselves behind not just in the realm of weapons, but in a vast array of interlocking realms due to Europe's burgeoning industrial revolution. The Self-Strengthening Movement and the Tongzhi Restoration were able to bring China up to par with Europe and Japan for a time, but they did so only through the action of individual ministers, rather than a systematic, long-term government program - when the ministers died or ceased to be in favour, China began to fall behind again, and at a time when military technology was advancing faster than ever.

This was a good book, and I will now go on at some length into the details that I found most interesting. Read more... )
glaurung: (Default)
One year on, and I am still seeing articles in leftist media focusing on the role the US has had in preventing peace talks and blowing up Russia's pipelines. Articles which then go on to say that, because of the horrible risk of a full scale atomic war breaking out between NATO and Russia, there has to be a peace agreement, if only a trustworthy 3rd party could negotiate one. (tears hair).

https://www.counterpunch.org/2023/02/15/hersh-the-us-and-the-sabotage-of-the-nordstream-pipelines/

https://www.counterpunch.org/2023/02/16/how-spin-and-lies-fuel-a-bloody-war-of-attrition-in-ukraine/

Yes, of course, the best solution is always to stop the killing and use diplomacy. But sometimes, diplomacy cannot and will not work because one or both sides refuse to negotiate, or are only going to approach negotiations in bad faith. Sometimes you have to have a fucking war and people have to suffer horribly and die needlessly, because the alternative is to let an imperialist dictatorship expand their territory, build up their empire, and create even more suffering and death (and ethnic cleansing and genocide) than any war ever could.

The butcher's bill for World War II is hard to conceive, but the toll that a German empire would have created if it had been allowed to carry out its plans for Lebensraum? Would have been worse. I am not sure what is causing today's leftists to think that negotiating peace with Putin would have a different outcome than it did 80 years ago with a different imperialist power that insisted on conquering its neighbours, but I really wish they would stop.

https://en.wikipedia.org/wiki/Peace_for_our_time

And yes, America, as the dominant imperialist power in the world, is happily taking advantage of the war to pursue its own aims. But you can't right the wrongs America continues to commit by letting Russia gobble up more parts of Ukraine, which is what a negotiated peace settlement would mean. And, because evidently it needs to be repeated endlessly until the leftist media stop acting like the only actors that matter are America and NATO: Ukraine is an actor in this war, and they have opinions and desires that matter:

https://truthout.org/articles/a-ukrainian-socialist-lays-out-the-aims-and-struggles-of-her-countrys-left/

The only way to make an empire give up the territory it has conquered at the negotiating table is by arm twisting... and the fundamental problem is that, it's impossible to arm twist a country that has enough nuclear weapons to destroy civilisation on Earth. The only other way to get a peace agreement is for enough death and suffering to happen that both warring sides decide they want the war to end. And, unfortunately, that seems to be a long way off.

After going on about a lot of military nuts and bolts stuff, here's the conclusion of the Collection of Unmitigated Pedantry blog's post last week:

https://acoup.blog/2023/02/24/collections-one-year-into-the-war-in-ukraine/

All of which is to say that unfortunately I do not see the war as being likely to end any time soon. Putin remains determined to carry the war through to a conclusion and indeed politically he probably cannot do otherwise, having backed himself into a corner with his annexation of Ukrainian territory he doesn’t control. Meanwhile Ukraine isn’t going to bargain away at the peace table territory that they could still win on the battlefield. There’s a psychological aspect to this as well, I suspect: it would be a hard ask for most Ukrainians at this point, after experiencing the cruelty and brutality of Russian attacks against civilians, to let the Russians ‘win.’ Human beings are willing to absorb a lot of hardship and suffering if it allows them to punish the people causing that hardship and suffering.

Consequently, so long as Putin remains in power – and there is little prospect of him being removed – Russia is unlikely to negotiate in good faith to end the war on terms that would be acceptable to Ukraine (as Ukraine is not going to give up towns and cities they’ve held or recaptured just to end the fighting). Meanwhile, as long as Ukrainians believe they can make gains on the battlefield both to secure territory (and protect the people in that territory from Russian atrocities) and to avenge the damage they’ve already sustained, they are unlikely to be willing to negotiate for anything short of a full Russian withdrawal, which would be politically fatal to Putin and thus unacceptable to him. So while I hope everyone is thinking about potential war termination scenarios, in practice the preconditions for an end to the conflict are likely far off.

And unfortunately that is where we stand a year in to the fighting. Russian forces have largely failed on the battlefield, losing territory consistently since April (but still a net gain compared to the January 2022 lines). Given the severity of Russian losses and the geopolitical consequences for Russia, I think it is fair to say that in a sense Russia has already lost. The question that remains, one year on, is if Ukraine can win and how bad the damage will be once the war ends.
glaurung: (Default)
New York City's office of Emergency Management has issued a surreal and brainless PSA about what to do if a nuclear bomb goes off in the city. Let's try a youtube embed:



Set aside the chirpy presenter and the outlandish assumption that there will still be broadcasts/phone service or internet for people to stay tuned to after a bomb goes off.

If there was an actual nuclear war, either NYC would not be targeted at all (because Russia would instead be using its nukes against naval bases, air bases, missile silos, and military command centres), or it would get one bomb, which would destroy essentially the entire city, and there wouldn't be much of anyone left alive to worry about how they were supposed to tune in (check out the map halfway down this page, showing the area of the firestorm after a standard-sized Russian nuke goes off above Manhattan).

The PSA seems to instead be contemplating a terrorist nuclear attack using a regular A-bomb instead of an H-bomb. Fearmongers ^H^H^H security "experts" have been beating the drum about how terrorists would love to obtain a nuke and set it off in a major American city for 20 years now. My library card let me look at this paper from 2009, written by one of those "experts," which cites 2007 congressional testimony by Richard Garwin, a "true genius" who thought there was a "20 percent per year probability of a nuclear explosion with American cities and European cities included," and also cites Matthew Bunn, who estimated in 2006 "the probability of a nuclear terrorist attack over a 10-year period to be 29 percent." That means over 10 years, the first expert thinks there is a 90% probability that terrorists will blow up a city, and in the two decades since 2001, the second guy thinks the odds of a city being nuked are around 50% (see formula note below). Something seems wrong with their figures. Could it be that they don't actually have any idea of the risk and are making up high numbers in order to make things sound more scary so they can continue to make money beating the drum over nuclear terrorism? Nah, couldn't be.

Regardless, I got some maps for the destruction from a very basic nuclear bomb - the Little Boy a-bomb dropped on Hiroshima, which shot a chunk of purified U235 into a cavity in another chunk of U235. The simplicity of the design was such that the Manhattan project scientists never bothered to test it - they were certain it would work. It's possible to make a much smaller bomb (eg, the minimum yield on the dial-a-yield H-bombs used in most American nukes), but that involves using more advanced techniques. For a bomb built in a garage by people without decades of expertise in bomb making to draw on, we can expect your basic Hiroshima or Nagasaki A-bomb of 15 or 20 kilotons. Here's the area destroyed by fire at Hiroshima. And here's a nukemap projection of the result of setting off the same bomb at the ground level of the Empire State Building. The 50% chance of 3rd degree burns circle there (1.8 km) is the same radius as the "2000 yard" circle in the Hiroshima map.

Basically, assuming the same size firestorm, a terrorist nuke set off in the street outside the Empire State Building would cause the entire width of Manhattan Island from 5th to 55th street to be destroyed by fire. And the staggering scale of that - from the most basic bomb that could plausibly be made in a garage by terrorists - is why all those "experts" have been so wrong for so many years. Because despite what American terrorism "experts" think, terrorists are not monsters intent on killing as many people as possible. They are politically savvy people using acts of violence to create a political effect. Terrorism is, basically, propaganda through violence. And even if they could get their hands on some plutonium or pure U235 (which would not be an easy or cheap undertaking), they know better than to do so. They know that the propaganda effect of setting off a nuke in a city would not help their cause. So they haven't tried.

But, the fearmongers continue to beat their drums, the Biden administration has not reversed course on Trump's baseless scapegoating of Iran, and the NYC office of emergency management has turned out a tone deaf and ludicrous PSA.

(Footnote: to calculate the odds of something with a per-year chance happening over several years, this page says that for probability p, (1-p) to the power N is the chance of it not happening at all over N years. The chance of it happening at least once is the complement (subtract that result from 1). So: convert the annual percent chance to a decimal, subtract that decimal from 1, raise the result to the power of the number of years, then subtract from 1 to get the chance of it happening over the longer time interval.)
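The footnote's arithmetic can be sketched in a few lines of Python; this is just a minimal illustration of the compounding formula, plugging in the figures quoted above:

```python
# Chance of an event happening at least once over N years,
# given an annual probability p: 1 - (1 - p) ** N
def at_least_once(p_annual, years):
    return 1 - (1 - p_annual) ** years

# Garwin's "20 percent per year" compounds to about 89% over a decade:
print(round(at_least_once(0.20, 10), 2))  # 0.89

# Bunn's 29% over 10 years implies an annual chance of about 3.4%;
# over the two decades since 2001 that compounds to roughly 50%:
p = 1 - (1 - 0.29) ** (1 / 10)
print(round(at_least_once(p, 20), 2))  # 0.5
```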
glaurung: (Default)
My mother has a crapton of pre-digital photographs that she would love to have on her computer. I tried scanning some of them with my flatbed scanner. Hours later, I had made a tiny dent in a shoebox full of photos. I needed something faster.

I did research. The wisdom of the internet said "get a document scanner that can do photos." The Epson Fastfoto was mentioned, but it costs $600 - far beyond my or my mother's budget - and when I looked for actual, non-SEO, non-astroturfed reviews (by googling "epson fastfoto reddit"), the consensus was that it produced poor quality images, especially if the picture was dark.

Fujitsu scansnap scanners were mentioned as an alternative. After much more research, and on the basis of reddit comments and a review that certainly seemed to be real, with claimed actual testing of its photo scanning abilities that concluded it produced good quality scans, I bought a scansnap ix1600 earlier this year. Instead of $600, it was $400 - just barely affordable (my mom and I split the cost). I tested it, it worked, and I set it aside. This month, I finally got around to using it on a shoebox of miscellaneous pictures.

The results were... DIRE. Read more... )

In sum: Once again, capitalism has ruined everything. If you want to scan a ton of photographs without spending a fortune, find a working old scansnap s1500m. And then hit me up for the drivers - fujitsu says they will take the drivers down from their site at the end of this year (the drivers will live on on other, more ad-filled sites, of course).
glaurung: (Default)
I've been having Thinky Thoughts about the racist, colonialist gatekeeping that goes on in archaeology and anthropology around which groups of people get to be called "agriculturalists" and which end up labelled as "hunter gatherers". Sources like wikipedia say the distinction is important because agriculture enables settled living, with higher population densities, and that in turn enables craft specialization, surplus resources, long term infrastructure, having nicer things, and so on.

But there are so many groups of people who did live in permanent or semi-permanent settlements and were able to have some or all of those knock on benefits, but who continue to be classed as "hunter gatherers" because they didn't *farm*. Or they didn't *plant crops*, even though they did maintain and harvest vast stands of edible wild plants. As research continues to turn up more and more examples of people who weren't nomads living sparsely in small groups and collecting only naturally occurring food sources, the "hunter gatherer" category looks more and more like a catch all for "anyone who doesn't make a living like our wheat growing ancestors."

The gatekeeping isn't only about restricting admittance to the sacred precincts of the "agriculturalist" club, it's also about preventing the creation of additional in-between labels to properly encompass the spectrum of strategies humans have used to feed themselves other than farming. Maintaining the dichotomy is vital to preserving the specialness of the agriculturalist in group and the subaltern status of everyone else. Instead, you find half-assed labels like "enhanced," "complex," or "affluent hunter gatherers." Even though the adjective and non-adjective groups share little in common apart from not being farmers. And even though the closer you look, the fewer regular food collecting cultures there were compared to (pick adjective) cultures.

Read more... )

I originally planned to say something about forest gardens and some of the other people in the world who get classed as (adjective) hunter gatherers instead of farmers or farming-adjacent, but this post has grown too long, so there will be a part 2.
glaurung: (Default)
I wrote these reviews over a year ago but never got around to posting them. I think I've pretty much run out of GWG films worth talking about. From worst to best:

Fox Hunter (1995)Read more... )

Outlaw Brothers (1990)Read more... )

The Godfather's Daughter Mafia Blues (1991)Read more... )

Queen's High (1991)Read more... )
glaurung: (Default)
Here at last is the third, very long, final part of my three part takedown of Incunabula's viral thread from early September.

To recap: there was a thread on Twitter by Incunabula that went viral last month. In James Burke-ian Connections style, Incunabula said that "Cheese 🧀 is one of the 5 things the Western book as we know it depends on. The other four are snails, Jesus, underwear and spectacles." The history in that thread ranges from over simplified and inaccurate, through simply wrong, to a disingenuous lie. Today I'm mostly covering the part about "snails" aka Phoenicians, alphabets, and how printing would never have taken off if Gutenberg had used a syllabary or a logosyllabic script instead of an alphabet. Which is built on a lie.

Part one: cheese and Christianity/mostly over simplified and inaccurate. Part two: underwear/simply wrong. The original viral thread that I am laboriously beating to death.

Read more... )
glaurung: (Default)
To recap: there was a thread on Twitter by Incunabula that went viral a few weeks ago. In James Burke-ian Connections style, Incunabula says that "Cheese 🧀 is one of the 5 things the Western book as we know it depends on. The other four are snails 🐌, Jesus ✝️, underwear 🩲 and spectacles 👓." When I re-watched James Burke's "Connections" a year or so ago, in the places where I actually knew some of the history he was covering, mostly I just found myself thinking, "that's way oversimplified and leaving out a lot in order to sound neat." But sometimes, I would think, "that's just not so." This post covers one of the "that's just plain false" parts of Incunabula's highly Eurocentric and inaccurate thread.

Read more... )
glaurung: (Default)
Yes, I know this thinky is not about snails/underwear either. I'll get to that, I promise.

You remember the story that was all over the news last week about how a comet/asteroid had blown up 3500 years ago over Tell el-Hamman on the shore of the Dead Sea, destroying the city, and this was covered as the source of the biblical legend of Sodom?

Well, buckle your seatbelts, it's going to get bumpy. The article in question, "A Tunguska sized airburst destroyed Tall el-Hammam a Middle Bronze Age city in the Jordan Valley near the Dead Sea" was published in Nature Scientific Reports (*not* Nature itself, as was often misreported). Nature Scientific Reports is Nature's far less prestigious open access counterpart. The article says it's based on fifteen years of annual excavations, which in the world of archaeology is quite a lot: someone must be very well funded.

The owner of the Slacktivist blog noticed this line in the opening section of the article: (the excavation project is) "under the aegis of the School of Archaeology, Veritas International University, Santa Ana, CA, and the College of Archaeology, Trinity Southwest University, Albuquerque, NM." Both of those schools are bible colleges. Trinity Southwest is proudly unaccredited. Veritas International is accredited by the Transnational Association of Christian Colleges and Schools (ie, an association of bible schools that wanted to be able to say they were accredited).

Trinity started out as an in-person seminary in Oklahoma in the 80's. After moving to New Mexico and becoming a distance learning school, they affiliated with an unnamed overseas "internationally-known Bible college and seminary," then declared themselves a university in the early 90's. Although primarily doing distance learning, they say they offer in person classes as well. However, they have no campus: what physical locations they have are scattered along Journal Center Boulevard in Albuquerque. Their website is pretty minimal. Their library doesn't get a dedicated section on the website, and the student catalog barely mentions that it exists. They do have their own press, though. And, they offer tours of the Holy Land for $4000 to see the sights, or $5000 if you also want to visit their archaeological dig at Sodom. Stuff they dig up at "Sodom" (aka Tell el-Hamman) goes on display in their very own archaeological museum in Albuquerque.

Veritas was founded as a seminary in 2008, and only decided to call themselves a university in 2017. Veritas's campus, as best I can tell from perusing the catalog, is one building, and their library has all of 4,000 paper books (unsurprisingly, the website gives a lot more emphasis to their digital resources). Library users are asked to bring their own computers, so the library doesn't really have terminals. They offer several doctor of divinity degrees, but just one PhD program: in biblical archaeology.

The websites of both of these schools are at pains to put their best face on and pretend that they are real institutions worthy of the name university despite not actually being anything like that. Trinity's campus is scattered, but there is no map in the student catalog or anywhere on the website showing where things are. While I think they don't actually offer in person classes except in a very minimal way, they do claim to have several physical resources and in person classrooms, and since those are not all in one place, there needs to be a map. But posting it to their site would be too much an admission of just how small and inconsequential they are. In the same vein, their catalog doesn't seem to differentiate between distance learning courses and actual in person classes. I think it's safe to say that they don't have many full time faculty other than, perhaps, a gaggle of distant adjuncts tasked with interacting with the distant learners who have been paying $250-ish per credit hour to support fifteen years of excavations in "Sodom."

Neither school provides basic academic information like the number of instructors or the number of students anywhere I could find. Also, unsurprisingly, neither school's website has a single word to say about COVID that I saw.

But, wait: there is more. The Tell el-Hamman paper has a very long list of authors. Only the last, Phillip J. Silvia, works at Trinity Southwest; all but one of the rest are affiliated with real universities, or else with real research laboratories. What's up with that? Turns out there's a second fly-by-night organization here, the "Comet Research Group." They get called out in the paper's acknowledgements for funding the research (as opposed to the excavations) behind the paper, and Allen West, the second to last author of the paper, is one of the CRG's founders.

The CRG is all about finding evidence that ancient comet/asteroid impacts caused local or global catastrophes. Before the Tell el-Hamman paper, they made a splash a few years ago with a proposal that the Younger Dryas, a thousand year cold snap that happened right at the end of the last ice age, was caused not by a shutdown of the Gulf Stream in the North Atlantic, as is usually thought by ancient climate researchers, but by the impact on Earth of a swarm of cometary fragments, Shoemaker–Levy 9-style, that caused widespread destruction, a nuclear winter that extended the ice age by a thousand years, mass extinctions of megafauna, population collapse of early humans in the Americas, and so forth.

The Younger Dryas impact hypothesis has been taken seriously by climate and ice age researchers, who investigated its claims and found them mostly without merit. The only thing that seems to have come out of it is a spike in platinum residue, in some locations, in sediment layers that are deemed to mark the start of the YD; that platinum could point to an asteroid or comet impact happening somewhere near the start of the YD. But the layers being pointed to are not all the same age, there is no agreement that these identified layers all in fact mark the start of the YD, the other signs found in those layers that are supposedly evidence of an impact are all very debatable, and so on.

What's odd is that the CRG has not responded to the criticism and critique of their hypothesis by the scientific community in the usual way (going back to the drawing board, trying to find new evidence, pruning away some of the more extreme claims in their hypothesis and saying surely we can agree on this part, etc), but rather by refusing to share their samples and data with people who they deem to be "on the other side" of the debate. Much more about that here, and on Mark Boslough's twitter (see below).

The long list of CRG "scientists and members" on their website includes co-authors of papers who have otherwise had nothing to do with the CRG, as well as people who were not asked if they minded being listed as CRG members, and when they found out they'd been so listed, were upset at being included. Very classy.

And finally, one of the co-founders of CRG, Allen West, is not actually an academic, does not actually have an advanced degree, and has in the past, under a different name, been convicted of selling fraudulent water studies to California municipalities despite not being a geologist. So, a con artist passing as an academic and geologist who befriended the other co-founders, became infected with their obsession with cometary impacts, and proceeded to reinvent himself as a cometary impact specialist.

An earlier version of the Tell el-Hamman paper appeared a few years ago. That paper, "The 3.7kaBP Middle Ghor Event: Catastrophic Termination of a Bronze Age Civilization" was a conference presentation at the 2018 annual meeting of the American Schools of Oriental Research. The ASOR (now called American Society of Overseas Research because someone realized their old name was racist, but they still publish a bulletin and hold an annual conference under their old name because, hey, still racists) dates back to 1900, when they were called the American School of Oriental Study and Research in Palestine. In short, they are the big leagues in the small (and sometimes dubious) field of biblical archaeology.

The ASOR presentation was by just 4 authors: Silvia and Steven Collins are both faculty at Trinity Southwest University. Ted Bunch is co-founder of the Comet Research Group, and finally Malcolm Lecompte, odd man out, is an emeritus faculty member at Elizabeth City State University in North Carolina. Lecompte has a website: he does not list the ASOR paper in his vita.

Looking at all that, it seems that the CRG, an organization devoted to proving that comets killed the mammoths and extirpated our distant ancestors 13,000 years ago, is Very Good Friends with a bunch of young earth creationist "archaeologists." But the CRG seems to have a rather Trump/Republican approach to science: declare your dubious findings as proven, then label anyone who disagrees as an enemy and refuse to cooperate with them. So, maybe not such strange bedfellows after all.

There's a lot more about the Tell el-Hamman paper's shoddy research and dubious claims to be found in Mark Boslough's twitter account (Boslough is an *actual* asteroid impact researcher). (Boslough is just one of many who are tearing their hair out over the paper, thanks to Robin Reid for bringing his twitter threads to my attention). Unfortunately Boslough has been posting his thoughts in several short threads, and not always remembering to link them together. Here's a starting point, but you may not always get continuation links (I ended up going to his main feed and scrolling down to find the next thread, but I started at a different point and he may have gone back and fixed things since then).

And this concludes our journey into the realm of fake science getting published in real journals and covered as legitimate.
glaurung: (Default)
Just a short gruntle about blind spots in archaeology today. I will do another post about the annoying viral twitter thread by Incunabula soon.

Archaeology is a very vexing science. On the one hand, it's amazing that we are able to figure out so many things about humans who lived tens of thousands of years ago. On the other hand, archaeologists are so timid in their approach, so unwilling to commit to any conclusion that they cannot prove by means of the stones and bones they dig up, and so wedded to certain theories that give them huge blind spots and force them to propound absurd conclusions, like the one underlying the dates on the arrows in the Americas for this map from Wikipedia.

We know that around 60,000+ years ago, ice age humans built boats and navigated across the water between Sunda (southeast Asia plus some of Indonesia) and Sahul (the rest of Indonesia and New Guinea/Australia). Ice age humans knew how to build boats. We know this for certain, because the people of Australia and New Guinea arrived there at a time when there was ocean between them and the rest of Eurasia, which they had to cross.

But change the context to the Americas, and suddenly the fact that humans were building boats and going around on the ocean 60,000 years ago vanishes utterly, and the conventional archaeological view is that the Siberian people who would go on to become the original Americans were land-dwelling people who walked across Beringia (the land between Siberia and Alaska) to North America 25,000 years ago, then cooled their heels in Alaska for 10,000 years until an ice-free route through the Rocky Mountains opened up, allowing them to walk, or rather, sprint, overland from Alberta to Tierra Del Fuego in less than 2,000 years.

This despite the fact that in historical times, with much more moderate weather, people who make a living in the arctic have often been boat-using people who get much of their food from the sea. If post-ice age people found the best way to survive in the far north was by going on the water, fishing and hunting whales and seals, why are we expected to believe that ice age people, facing a much harsher climate, would limit themselves to the food they could find on land?

Part of this absurdity - the idea that ancient Americans only arrived south of the ice sheets 16,000 years ago - is finally starting to crumble under the increasing weight of evidence for a much older human presence in the Americas - from the 23,000 year old footprints of children found in White Sands National Monument (published just this week, sadly paywalled), to 30,000 year old tools found in caves in Mexico (open access version), and on to many more studies going back decades. The archaeological community is very good at straining at gnats and swallowing camels, attacking any evidence that contradicts the accepted conventional theory as being misdated, misinterpreted, or not actually of artefacts made by humans at all. In other words, these findings of people in the Americas 20,000+ years ago are still, sadly, controversial.

And it doesn't have to be this way. We know that humans built boats, even if none of those boats have survived to be dug up. We know that the sea shore and shallows are one of the richest habitats on the planet for food-collecting peoples - a fact obscured by the way that those who collect food from the sea have been labelled as "fishers" while land based food collectors are called "hunter gatherers."

There is no way. No. Way. that inland dwelling food collectors of 60,000-ish years ago walked to the end of the land in Sunda, built boats to get to Sahul, and then walked away from the shore inland. They were fishers and boat people, they lived on the shore. They travelled up rivers, looking for more good fishing spots, settling on lakes and marshes away from the shore, and gradually over time, the descendants of those fresh water boat people adapted to a fully land-based lifestyle and filled the interior of the continent.

We know that the sea today is around 120 meters higher than it was during the ice age, which means that all of the homes used by the sea peoples of 60,000 years ago are now buried deep under the water, but that doesn't mean they did not exist. If ice age people were building boats in Sunda 60,000 years ago, why are we expected to believe that ice age people 25,000 years ago in Siberia did not know how to build boats and walked to Alaska, then were stuck there for ten thousand years waiting for a clear walking path through the ice to the rest of America?

Thankfully, some archaeologists are more sensible than the bulk of their colleagues and have begun advocating for the initial peopling of the Americas to have been done by boat people who, having fished and sealed their way along the coast from Siberia through Beringia to Alaska, skirted the ice sheets covering the shore of the Northwest Coast, hopping from one ice-free cove or island to another until they got south of the ice and had clear rowing (or sailing) to warmer parts further south.

I find it far more believable that a boat-based, marine mammal hunting and fishing people just kept going south in their boats, filling the coast throughout both continents, then gradually moved inland, than the mainstream theory that the entire continent was filled virtually all at once by land-dwellers who sprinted through a dozen different climates and biomes, their population exploding while they were constantly on the move, on foot, ever southward.
glaurung: (Default)
Eurocentric history tends to be very self-congratulatory. "(European guy) invented (arguably very important technology)," and "(historical change of import) happened because of (thing European or Greco-Roman people did)." It gets very tiresome, especially when the technology in question was actually invented hundreds or thousands of years earlier, far outside of the Euro-Greco-Roman sphere.

Today's example: a thread by Incunabula on Twitter. In James Burke-ian Connections style, he says that "Cheese 🧀 is one of the 5 things the Western book as we know it depends on. The other four are snails 🐌, Jesus ✝️, underwear 🩲 and spectacles 👓." Sigh. Burke's "Connections" blew my mind when I was eight. Because I was too young to notice how a shallow, facile and simplified, all white, mostly male narrative was being constructed from a far richer and larger history.

Today I'm tackling two of Incunabula's five things: cheese (parchment) and Christianity (the adoption of the codex). My primary source for most of what follows is an online version of "The Birth of the Codex" by Colin Roberts and T C Skeat, supplemented by Wikipedia and lots of blog posts/articles found via google. The online version of Birth of the Codex includes (in green and red text) incomplete edits and updates of the original book.

1. "Cheese" (aka, parchment/vellum):
Read more... )
How much did Christianity's extremely early and wholehearted embrace of codexes have to do with the switch from scrolls to codexes? Not a lot. Christians were a tiny minority in a vast empire until well into the 3rd century. By the time Christianity became a major force in the empire (~300 AD), the switch to codexes was already underway. Clearly the rise of Christianity to the empire's official religion in the 4th century greatly accelerated the transition, but the evidence doesn't point to Christians being the starter of the trend.
glaurung: (Default)
When the "cities plant male trees because they don't want to deal with cleaning up fruit, and this makes allergies worse" meme made its rounds a while back, I didn't think much of it, but now that I've read this thread, I went and refreshed my memory about tree sex.

The meme is wrong, but the thread linked above swerves too far the other way and ends up wrong as well.

Actually, species that have separate sexes make up about 1/5 of the trees in the Eastern US. When planting trees of those species, cities may choose to go with male trees only, but probably don't bother unless the fruit/seeds are large and messy but the trees themselves are desirable (selecting for the gender of tree adds to the cost, and cities are loath to spend more money when they can avoid it by just picking a different, non-messy species or not bothering to screen their plantings for sex).

Things to remember:
1. Plant sex is complicated.
2. Dioecious is a fancy greek term for species that have separate organisms for male and female reproductive functions. Nearly all animals we're familiar with are dioecious. Among plants, dioecy is relatively rare. The "urban male trees are responsible for your allergies" meme is talking about dioecious species, which are a minority among trees (1/5 in the Eastern US, 1/20 worldwide).
3. The only pollen that's an issue for those with allergies is pollen from wind-pollinated species, which would be many trees, most grasses, and various herbs (Flowers that look pretty are not wind pollinated). The pollen from male dioecious trees that have been deliberately planted because they are male and will not litter the ground with fruit is a drop in the bucket of the overall pollen load.

Nerdy tree sex tidbits:
4. Non-dioecious plants cover a lot of variety. They can produce one flower that performs both sexual functions (cosexual). They can have separate pollen and seed producing parts on the same plant (monoecious). Or, just to be different, they can have some combination of cosexual and monoecious bits on the same or separate plants (polygamous).
5. Monoecious and polygamous trees cover the gamut from having sexual phases of life (eg, making pollen when young and seeds when mature), to switching from making pollen to making seeds and back again every year or every few years, to being boring and producing both each year.
glaurung: (Default)
The default free software recommendation for cloning a hard disk, Clonezilla, is actively and aggressively user hostile, like many programs from the world of Linux. But it's free, and comes with the special sparkle of being open source, so it gets recommended a lot.

This has led me to more bouts of swearing and struggling with balky upgrades than I can count. But it seemed all the user-friendly alternatives that I could find were for-pay products that haughtily refused to clone a boot disk unless you forked over tons of money for the pro version.

Now, I have finally, finally found an app whose free version lets you clone a system disk, with no fuss or bother about size, without having to jump through an intricate dance of copying each individual partition and hoping it would still boot when all was done.

Disk Genius is one of those rare birds, a for-pay app where the free version is actually extremely useful and capable. It's almost never recommended for partition wrangling and cloning, because it's marketed as a data recovery app... but what it actually is, is a universal, do-everything tool for dealing with computer storage devices and the data on them, including cloning, partition resizing, and recovering lost data, lost drives, etc. And now, finally, I can tell clonezilla and its user-hating interface to not let the door hit its ass on the way out.

Disk Genius: one of those apps that you wonder where it was hiding from you all these years. A+. Highly recommended.
glaurung: (Default)
Another thinky post. This one will be short. First, a shout-out to [personal profile] conuly who kindly drew attention to my error in the last post about food insecurity, and also gave me an invaluable new vocabulary term: "Food Collecting Peoples" instead of "Hunter Gatherers" does away with the "hunter" label that brings up sexist and inaccurate ideas about how those people lived.

I made a mistake in the first "extra stuff" note I put in the comments to the last thinky post about the invention of agriculture and whether inequality and war are necessarily linked to "civilization."

It's a very common assumption that food collecting peoples live a more precarious and food insecure existence than farmers - that they are more in danger of starving to death. This has generated not just harmless mistakes like my footnote but scads of bad science based on this assumption, such as the harmful "Thrifty Genotype" hypothesis among dieticians, which assumes that weight gain happens because humans are adapted to survive alternating waves of feast and famine, and thus that self-starvation through dieting is the only proper way to address obesity (the more I learn about diet and obesity science, the more I learn just how wrongheaded and discriminatory the entire discipline is).

In fact, food collecting peoples were *not* more prone to suffer food insecurity than farming peoples. Three separate analyses using the Standard Cross-Cultural Sample (TIL that there is a standardized data set for making cross cultural analyses, which is tailored to eliminate similarities due to cultural borrowing by limiting itself to cultures that are widely separated in space/time) found that this is in fact not the case. Comparing across all cultures in the sample, there's no difference in food insecurity between farmers and food collectors. If you control for climate (because arctic food collectors like the Inuit *are* more food insecure and in danger of starvation), then food collectors are *less* likely to be food insecure than farmers. Farmers are tied to their land and at the mercy of whatever happens to their crops, but food collectors can pick up and move to a different area, or simply switch to a different food source that was not impacted by the drought/flood/whatever (here is the only open access article of the three. It's the most recent and footnotes 19 and 20 link to the other two articles. CW: the article's discussion centres the obesity research angle).

The myth of food collectors' food insecurity is mostly born of prejudice (the assumption that life "in the state of nature" was "nasty, brutish, and short" dates back at least to the 17th century). Some of it is due to selective noticing of the data: famine and food insecurity were at least sometimes an issue for both food collectors and farmers. And some of it traces back to the artificially created food insecurity of people under colonialism and post-colonialism, which the colonizers always blamed on the victims rather than admitting their role. (Having your land arbitrarily chopped into blocks that you're not allowed to go onto is not good for food collectors, even before we add the colonizers actively murdering them.) Even today, most google hits for "hunter gatherer food insecurity" are papers and articles about how *former* food collecting cultures are suffering food insecurity now that they have largely ceased their own collecting practices and come to rely on food distribution by the nations in which they live. And, finally, at least in my case, some of the myth is due to wrongly applying the special case of arctic peoples (the one climate where agriculture is impossible and food collectors do suffer from increased food insecurity) too broadly - I remembered reading about how Nanook (of the 1920s documentary Nanook of the North) starved to death a few years after the film was made, which was explained as not unusual among the Inuit, and I took that as confirmation that I could accept the received myth and didn't have to google yet another fact.

So, with that myth debunked, why, exactly, did food collecting people switch to farming over most of the world? Farming is measurably worse by almost every metric: more work for the same or greater food insecurity, with more disease, worse nutrition (from a less varied diet), shorter lifespans, shorter adult stature, etc.

That farming made you more susceptible to disease was not evident to premodern people lacking the tools for statistical analysis, but at least some of the consequences of farmers' ill health *were* visible. An example off the top of my head (from the book 1491 by Charles Mann): early European accounts of First Nations people mentioned how healthy, tall, robust, and handsome Indians were compared to the malnourished, disease-ridden Europeans. Most of the Indians in question were farmers themselves, but they had a broader, more nutritionally complete set of crops and, without domesticated animals, they were relatively disease-free. If the bigoted Europeans noticed and commented on the difference between better-nourished, disease-free First Nations farmers and themselves, then food collectors must have noticed differences in health between themselves and the farmers whose technology they adopted.

First, in some places, food collectors didn't switch so much as get assimilated by farmers who moved into the area - genetic analysis of human remains from central Europe shows that the switch from food collecting to farming involved a genetic change, with an influx of people showing some degree of Anatolian ancestry moving in with their farming technology and mating with the local food collectors. But in other areas that genetic shift does not occur (the food collectors of the Baltic states, for instance, adopted the agricultural technology but did not interbreed with the people that brought it to them).

Second, depending on how violent that assimilation process was, people like those Baltic food collectors might have adopted farming in self-defence, regardless of the downsides.

Elsewhere, for instance in North America, there's clear evidence of agricultural technology diffusing without attendant migration, so: no assimilation or threat of assimilation. Why switch to a food system that required more work, had visible negative effects on the people who adopted it, and provided no real improvement in food security?

One common answer is that they were forced to by population pressure. (eg, Jared Diamond) This is Malthusian bullshit (another thinky post about Malthus being completely wrong will happen someday). Humans have always had the ability to limit their family sizes. Population only increased when technological change made it possible to reduce the land area per person. Look at the times between those technological shifts, and population remains extremely stable with little to no growth for vast stretches of time. There was no Malthusian pressure on food collectors to increase their food supply. Population increases happened after they changed their technology, not before.

Another answer I've seen mooted is that agriculturalists live settled lives and that enables them to accumulate more belongings and become richer than nomads. This overlooks the vast number of settled food collecting societies, where rich natural food sources meant people could live in one place permanently and own lots of things, without becoming farmers. It also overlooks that nomadic food collectors had a home range with which they were deeply familiar, and a limited number of home camps that they visited at more or less set times, depending on the season and what food sources were due to become collectable where. They could cache belongings at those camps and not have to limit themselves to what they could carry. So they weren't necessarily as poor and bereft of possessions as the popular conception of them would have it.

A lot of the links I get when googling for reasons that food collectors switched to farming focus instead on the *invention* of farming, and suggest that it happened because settled food collectors in naturally rich areas (like the parts of the fertile crescent where wheat farming was invented) had to either become unsettled or invent new ways to get food when the place they had been living became less rich, whether due to climate change or over-exploitation. That is not at all in accord with what we know about the actual timelines of plant domestication, which extend back to the height of the last glacial period - the food collectors were perfecting agriculture while the climate was improving and the richness of their homeland was increasing (see my previous thinky, linked above).

To restate the question: of the food collectors who had the choice to adopt already-invented farming technology (many desert/steppe dwellers and all arctic people did not have that choice, nor did those who adopted it under the threat of assimilation), some did not adopt the new tech, or resisted doing so until colonialism/invasion took away their choice (maybe because they saw evidence of the many downsides of farming). Others accepted the choice, despite those visible downsides: why? I still haven't found a reason proposed that sits well with me. But I do have a crackpot theory of my own.

Maybe, just maybe, it was because while agriculture did not provide any real benefit to settled food collectors, it did give the *appearance* of benefit. It gave the illusion of control: it made the people who did it feel that they were better able to ward off bad times because as a farmer, you were creating your own food, instead of being dependent on the forces of nature to provide food for you. Food collecting meant being at the mercy of countless factors beyond your control or ken. Farming meant being at the mercy of just one: rainfall. It wasn't actually better than food collecting, but it felt better, because *it was less scary*, and that's why it proved so popular.
